890 results for analytical methodologies
Abstract:
Biological systems exhibit rich and complex behavior through the orchestrated interplay of a large array of components. It is hypothesized that separable subsystems with some degree of functional autonomy exist; deciphering their independent behavior and functionality would greatly facilitate understanding the system as a whole. Discovering and analyzing such subsystems are hence pivotal problems in the quest to gain a quantitative understanding of complex biological systems. In this work, using approaches from machine learning, physics and graph theory, methods for the identification and analysis of such subsystems were developed. A novel methodology, based on a recent machine learning algorithm known as non-negative matrix factorization (NMF), was developed to discover such subsystems in a set of large-scale gene expression data. This set of subsystems was then used to predict functional relationships between genes, and this approach was shown to score significantly higher than conventional methods when benchmarked against existing databases. Moreover, a mathematical treatment was developed for simple network subsystems based only on their topology (independent of particular parameter values). Application to a problem of experimental interest demonstrated the need for extensions to the conventional model to fully explain the experimental data. Finally, the notion of a subsystem was evaluated from a topological perspective. A number of different protein networks were examined to analyze their topological properties with respect to separability, seeking to find separable subsystems. These networks were shown to exhibit separability in a nonintuitive fashion, while the separable subsystems were of strong biological significance. It was demonstrated that the separability property found was not due to incomplete or biased data, but is likely to reflect biological structure.
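As a minimal, hedged sketch of the kind of NMF-based subsystem discovery this abstract describes (synthetic data, an off-the-shelf scikit-learn factorization, and an illustrative similarity score, not the thesis's actual pipeline):

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.metrics.pairwise import cosine_similarity

# Synthetic non-negative expression matrix: rows = genes, columns = conditions.
# Purely illustrative values, not the thesis's expression compendium.
rng = np.random.default_rng(0)
n_genes, n_conditions, n_subsystems = 200, 40, 5
X = rng.gamma(shape=2.0, scale=1.0, size=(n_genes, n_conditions))

# Factor X ~ W @ H: W holds each gene's loading on the inferred "subsystems".
model = NMF(n_components=n_subsystems, init="nndsvd", max_iter=500, random_state=0)
W = model.fit_transform(X)      # genes x subsystems
H = model.components_           # subsystems x conditions

# Assign each gene to its dominant subsystem.
membership = W.argmax(axis=1)

# Score putative functional relatedness of gene pairs by the similarity of
# their subsystem loading profiles (one simple choice; the thesis's scoring
# scheme may differ).
relatedness = cosine_similarity(W)
print("genes in subsystem 0:", np.where(membership == 0)[0][:10])
print("relatedness(gene 3, gene 7) =", round(float(relatedness[3, 7]), 3))
```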
Abstract:
Caches are known to consume up to half of all system power in embedded processors. Co-optimizing the performance and power of the cache subsystem is therefore an important step in the design of embedded systems, especially those employing application-specific instruction processors. In this project, we propose an analytical cache model that succinctly captures the miss performance of an application over the entire cache parameter space. Unlike exhaustive trace-driven simulation, our model requires that the program be simulated only once so that a few key characteristics can be obtained. Using these application-dependent characteristics, the model can span the entire cache parameter space, consisting of cache sizes, associativities, and cache block sizes. In our unified model, we are able to cater for direct-mapped, set-associative, and fully associative instruction, data, and unified caches. Validation against full trace-driven simulations shows that our model has a high degree of fidelity. Finally, we show how the model can be coupled with a power model for caches so that one can very quickly identify Pareto-optimal performance-power design points for rapid design space exploration.
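A minimal sketch of the design-space sweep this abstract describes: an analytical miss-rate estimate is evaluated over the cache parameter space, combined with a power estimate, and filtered to Pareto-optimal points. The miss-rate and energy functions below are simple stand-ins, not the paper's model.

```python
import itertools

def miss_rate(size_kb, assoc, block_b):
    # Stand-in analytical miss-rate curve (power law in size, mild benefit
    # from associativity and block size); NOT the paper's model.
    return 0.3 * size_kb ** -0.5 / (1 + 0.1 * assoc) * (32.0 / block_b) ** 0.2

def energy_per_access(size_kb, assoc, block_b):
    # Stand-in power model: energy grows with size and associativity.
    return 0.05 * size_kb * (1 + 0.2 * assoc) + 0.01 * block_b

def pareto_front(points):
    # Keep (miss, energy) points not strictly dominated by any other point.
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])
                        for q in points)
        if not dominated:
            front.append(p)
    return front

space = itertools.product([4, 8, 16, 32, 64],   # cache size in KB
                          [1, 2, 4, 8],         # associativity
                          [16, 32, 64])         # block size in bytes
points = [(miss_rate(s, a, b), energy_per_access(s, a, b), (s, a, b))
          for s, a, b in space]
for m, e, cfg in sorted(pareto_front(points)):
    print(f"size={cfg[0]}KB assoc={cfg[1]} block={cfg[2]}B  miss={m:.4f}  energy={e:.3f}")
```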
Abstract:
This paper analyzes a proposed release control methodology, WIPLOAD Control (WIPLCtrl), using a transfer line case modeled as a Markov process. The performance of WIPLCtrl is compared with that of CONWIP under 13 system configurations in terms of throughput, average inventory level, and average cycle time. As a supplement to the analytical model, a simulation model of the transfer line is used to observe the performance of the release control methodologies with respect to the standard deviation of cycle time. From the analysis, we identify the system configurations in which the advantages of WIPLCtrl can be observed.
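For intuition about release control on a transfer line, a toy slotted-time simulation contrasting a capped (CONWIP-style) release rule with uncontrolled push release is sketched below. It does not implement WIPLCtrl's workload-based trigger or the paper's Markov model, and all parameters are illustrative.

```python
import random
from collections import deque

def simulate(release_rule, stations=3, p_finish=0.5, horizon=20000, cap=6, seed=1):
    """Toy slotted-time transfer line: in each slot a busy station finishes its
    current job with probability p_finish; release_rule decides whether a new
    job may enter the line this slot."""
    random.seed(seed)
    queues = [deque() for _ in range(stations)]
    done, cycle_times = 0, []
    for t in range(horizon):
        wip = sum(len(q) for q in queues)
        if release_rule(wip, cap):
            queues[0].append(t)                 # remember the release slot
        for s in range(stations - 1, -1, -1):   # back to front: one move per slot
            if queues[s] and random.random() < p_finish:
                job = queues[s].popleft()
                if s == stations - 1:
                    done += 1
                    cycle_times.append(t - job)
                else:
                    queues[s + 1].append(job)
    return done / horizon, sum(cycle_times) / max(len(cycle_times), 1)

conwip = lambda wip, cap: wip < cap   # constant-WIP (pull) release
push = lambda wip, cap: True          # uncontrolled push release
for name, rule in (("CONWIP", conwip), ("push", push)):
    th, ct = simulate(rule)
    print(f"{name:6s}  throughput={th:.3f} jobs/slot  avg cycle time={ct:.1f} slots")
```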
Abstract:
Outlines of both Scrum and the Dynamic Systems Development Method (DSDM).
Abstract:
In today's world, the strategic-process approach is a variable that carries ever greater weight and influence in the decisions of company directors; however, little has been studied about this variable when it is adopted by the new models of business cooperation within organizations. For this reason, this research seeks to clarify the applicability of the methodologies and tools used in the strategic planning of a single company to a business network, and to serve as a first approach to the most important aspects of strategic planning for a network.
Abstract:
This work compiles relevant academic literature on entry strategies and on methodologies for deciding whether to contract outsourcing services, for the case of companies planning to expand into foreign markets. The way a company plans its entry into a foreign market, considers and evaluates the relevant information, and designs its strategy determines whether that strategy succeeds. The methodologies considered focus on the strategic level of the organizational pyramid, starting from simple methods and moving to those based on Multi-Criteria Decision Theory, both individual and hybrid (see the sketch below for a minimal weighted-sum example). Finally, System Dynamics is presented as a valuable tool in the process, since it can be combined with multi-criteria methods.
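As a minimal illustration of the simplest class of multi-criteria methods surveyed, a weighted-sum scoring of hypothetical entry alternatives is shown below; the criteria, weights, and scores are invented for illustration and are not drawn from the reviewed literature.

```python
# Hypothetical criteria weights and alternative scores (0-10 scale); the
# alternatives and numbers are purely illustrative.
weights = {"cost": 0.35, "market_knowledge": 0.25, "control": 0.20, "risk": 0.20}

alternatives = {
    "export via local distributor": {"cost": 8, "market_knowledge": 7, "control": 4, "risk": 7},
    "outsource operations":         {"cost": 7, "market_knowledge": 5, "control": 5, "risk": 6},
    "wholly owned subsidiary":      {"cost": 3, "market_knowledge": 4, "control": 9, "risk": 3},
}

def weighted_score(scores, weights):
    # Simple additive (weighted-sum) aggregation over normalized weights.
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in sorted(alternatives.items(),
                           key=lambda kv: weighted_score(kv[1], weights),
                           reverse=True):
    print(f"{name:30s} {weighted_score(scores, weights):.2f}")
```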
Abstract:
Based on the experiences of Colombia, Brazil and Bolivia, the paper proposes a general analytical framework for participatory mechanisms. The analysis is oriented to detecting the incentives in each system and the ethics and behavior sustaining them. It investigates the sustainability of participatory democracy in the face of tensions with representative democracy. The article presents a theoretical framework built from these experiences of institutional design and political practice, and confronts it with the theoretical conceptualizations of participatory democracy in Bobbio, Sartori, Elster and Nino, among others. In this context, different ways in which those schemes can be inserted into political systems become apparent, along with the variables that result from combining elements of direct, representative and participatory democracy.
Abstract:
It is necessary to take rural education processes on both continents seriously. To that end it would be very important that, beyond national policies but without dispensing with them, the international community and its major organizations commit seriously to the problems of strategic planning for rural education, without excluding Asia, and mobilize all kinds of will and resources. The World Education Forum held in Dakar was decisive for understanding and addressing the search for solutions to education worldwide. The objective of the 155 countries gathered there was to review the international community's commitments to reducing illiteracy and achieving universal access to quality education. The date set for that great change was the year 2000, although already at the 1995 meeting governments had to acknowledge that progress was too slow and postponed it to 2015. Nevertheless, the main organizations gathered around the global campaign for education warn that, if things continue as they are, we will reach that date in the same situation. What has happened? Is it a problem of cost? No. Governments have, or could have, sufficient resources to meet the cost of universal basic education if they corrected certain errors or perverse tendencies in the use of those resources. That is, access to primary school for all children between six and twelve years of age, and adult literacy. If current trends were reversed, they could also go further and commit to generalized education, an essential element for the lasting development of humanity. It is a question of political will.
Abstract:
Monographic issue titled 'Identidad y educación'. Abstract based on that of the publication.
Abstract:
The characteristics of service independence and flexibility of ATM networks make the control problems of such networks very critical. One of the main challenges in ATM networks is to design traffic control mechanisms that enable both economically efficient use of network resources and the desired quality of service for higher-layer applications. Window flow control mechanisms of traditional packet-switched networks are not well suited to real-time services at the speeds envisaged for future networks. In this work, the utilisation of the Probability of Congestion (PC) as a bandwidth decision parameter is presented. The validity of using PC is compared with QoS parameters in bufferless environments, where only the cell loss ratio (CLR) parameter is relevant. The convolution algorithm is a good solution for CAC in ATM networks with small buffers. If the source characteristics are known, the actual CLR can be estimated very well. Furthermore, this estimation is always conservative, allowing the retention of the network performance guarantees. Several experiments have been carried out and investigated to explain the deviation between the proposed method and the simulation. Time parameters for burst length and different buffer sizes have been considered. Experiments to determine the limits of the burst length with respect to the buffer size conclude that a minimum buffer size is necessary to achieve adequate cell contention. Note that propagation delay is a limit that cannot be dismissed for long-distance and interactive communications, so small buffers must be used in order to minimise delay. Under these premises, the convolution approach is the most accurate method used in bandwidth allocation. This method gives sufficient accuracy in both homogeneous and heterogeneous networks. However, the convolution approach has a considerable computational cost and a high number of accumulated calculations. To overcome these drawbacks, a new evaluation method is analysed: the Enhanced Convolution Approach (ECA). In ECA, traffic is grouped into classes of identical parameters. By using the multinomial distribution function instead of the formula-based convolution, a partial state corresponding to each class of traffic is obtained. Finally, the global state probabilities are evaluated by multi-convolution of the partial results. This method avoids accumulated calculations and saves storage requirements, especially in complex scenarios. Sorting is the dominant factor for the formula-based convolution, whereas cost evaluation is the dominant factor for the enhanced convolution. A set of cut-off mechanisms is introduced to reduce the complexity of the ECA evaluation. The ECA also computes the CLR for each class j of traffic (CLRj); an expression for evaluating CLRj is also presented. We can conclude that, by combining the ECA method with cut-off mechanisms, utilisation of ECA in real-time CAC environments as a single-level scheme is always possible.
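A minimal sketch of the convolution idea behind this kind of CAC, under a bufferless view: per-source bandwidth-demand distributions are convolved into an aggregate distribution, from which a probability of congestion (demand exceeding link capacity) is read off. The on/off source parameters and link capacity are illustrative assumptions, not the thesis's traffic model, and the sketch does not include the class grouping or cut-off mechanisms of ECA.

```python
import numpy as np

def source_distribution(peak, activity):
    """On/off source: demands `peak` bandwidth units with probability
    `activity`, otherwise 0. Returns a probability vector indexed by demand."""
    dist = np.zeros(peak + 1)
    dist[0] = 1.0 - activity
    dist[peak] = activity
    return dist

def convolve_sources(sources):
    """Aggregate demand distribution as the convolution of all per-source
    distributions (sources assumed independent)."""
    agg = np.array([1.0])
    for dist in sources:
        agg = np.convolve(agg, dist)
    return agg

def congestion_probability(agg, capacity):
    """Probability that aggregate demand exceeds link capacity (bufferless view)."""
    return agg[capacity + 1:].sum()

# Hypothetical mix: 20 sources at peak 2 with 30% activity,
# plus 5 sources at peak 10 with 10% activity.
sources = [source_distribution(2, 0.3)] * 20 + [source_distribution(10, 0.1)] * 5
agg = convolve_sources(sources)
print("P(congestion) on a 30-unit link:", congestion_probability(agg, 30))
```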
Abstract:
This study investigates how children with cochlear implants from simultaneous communication backgrounds and from oral education backgrounds experience communication breakdowns. It examines each group's responses to communication breakdowns and the repair strategies each group uses.
Abstract:
The common GIS-based approach to regional analyses of soil organic carbon (SOC) stocks and changes is to define geographic layers for which unique sets of driving variables are derived, including land use, climate, and soils. These GIS layers, with their associated attribute data, can then be fed into a range of empirical and dynamic models. Common methodologies for collating and formatting regional data sets on land use, climate, and soils were adopted for the project Assessment of Soil Organic Carbon Stocks and Changes at National Scale (GEFSOC). This permitted the development of a uniform protocol for handling the various inputs to the dynamic GEFSOC Modelling System. Consistent soil data sets for Amazon-Brazil, the Indo-Gangetic Plains (IGP) of India, Jordan and Kenya, the case study areas considered in the GEFSOC project, were prepared using methodologies developed for the World Soils and Terrain Database (SOTER). The approach involved three main stages: (1) compiling new soil geographic and attribute data in SOTER format; (2) using expert estimates and common sense to fill selected gaps in the measured or primary data; and (3) using a scheme of taxonomy-based pedotransfer rules and expert rules to derive soil parameter estimates for similar soil units with missing soil analytical data. The most appropriate approach varied from country to country, depending largely on the overall accessibility and quality of the primary soil data available in the case study areas. The secondary SOTER data sets discussed here are appropriate for a wide range of environmental applications at national scale, including agro-ecological zoning, land evaluation, modelling of soil C stocks and changes, and studies of soil vulnerability to pollution. Estimates of national-scale SOC stocks, calculated using SOTER methods, are presented as a first example of database application. Independent estimates of SOC stocks are needed to evaluate the outcome of the GEFSOC Modelling System for current conditions of land use and climate.
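A minimal sketch of the standard per-layer SOC stock calculation that underlies such database applications: carbon content, bulk density, layer thickness, and coarse-fragment correction are combined per layer, summed over the profile, and scaled by the mapped area. The profile values and unit area below are hypothetical, not GEFSOC or SOTER data.

```python
def soc_stock_t_ha(oc_percent, bulk_density_g_cm3, depth_cm, coarse_frac=0.0):
    """Soil organic carbon stock in t C/ha for one soil layer.

    oc_percent: organic carbon content, % by mass
    bulk_density_g_cm3: fine-earth bulk density, g/cm3
    depth_cm: layer thickness, cm
    coarse_frac: volumetric fraction of coarse fragments (0-1)
    """
    # %C x BD (g/cm3) x depth (cm) x (1 - coarse) conveniently yields t C/ha.
    return oc_percent * bulk_density_g_cm3 * depth_cm * (1.0 - coarse_frac)

# Hypothetical two-layer profile (0-30 cm and 30-50 cm), scaled by a mapped area.
layers = [(1.5, 1.30, 30, 0.05), (0.8, 1.45, 20, 0.10)]
stock_per_ha = sum(soc_stock_t_ha(*layer) for layer in layers)
area_ha = 120_000  # hypothetical extent of one soil-landscape (SOTER) unit
print(f"{stock_per_ha:.1f} t C/ha, {stock_per_ha * area_ha / 1e6:.2f} Mt C for the unit")
```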
Abstract:
This study explores the way in which our picture of the Levantine Epipalaeolithic has been created, investigating the constructs that take us from found objects to coherent narrative about the world. Drawing on the treatment of chipped stone, the fundamental raw material of prehistoric narratives, it examines the use of figurative devices - of metaphor, metonymy, and synecdoche - to make the connection between the world and the words we need to describe it. The work of three researchers is explored in a case study of the Middle Epipalaeolithic with the aim of showing how different research goals and methodologies have created characteristics for the period that are so entrenched in discourse as to have become virtually invisible. Yet the definition of distinct cultures with long-lasting traditions, the identification of two separate ethnic trajectories linked to separate environmental zones, and the analysis of climate as the key driver of change all rest on analytical manoeuvres to transform objects into data.