831 results for computational complexity
Abstract:
Understanding how biological visual systems perform object recognition is one of the ultimate goals in computational neuroscience. Among biological models of recognition, the main distinctions are between feedforward and feedback models, and between object-centered and view-centered representations. From a computational viewpoint, the different recognition tasks (for instance, categorization and identification) are very similar, representing different trade-offs between specificity and invariance. Thus the different tasks do not strictly require different classes of models. The focus of this review is on feedforward, view-based models that are supported by psychophysical and physiological data.
Abstract:
Modeling and simulation permeate all areas of business, science and engineering. With the increase in the scale and complexity of simulations, large amounts of computational resources are required, and collaborative model development is needed, as multiple parties may be involved in the development process. The Grid provides a platform for coordinated resource sharing and for application development and execution. In this paper, we survey existing technologies in modeling and simulation, focusing on interoperability and composability of simulation components for both simulation development and execution. We also present our recent work on an HLA-based simulation framework on the Grid, and discuss the issues involved in achieving composability.
Abstract:
We present a technique for the rapid and reliable evaluation of linear-functional outputs of elliptic partial differential equations with affine parameter dependence. The essential components are (i) rapidly, uniformly convergent reduced-basis approximations — Galerkin projection onto a space WN spanned by solutions of the governing partial differential equation at N (optimally) selected points in parameter space; (ii) a posteriori error estimation — relaxations of the residual equation that provide inexpensive yet sharp and rigorous bounds for the error in the outputs; and (iii) offline/online computational procedures — stratagems that exploit affine parameter dependence to decouple the generation and projection stages of the approximation process. The operation count for the online stage — in which, given a new parameter value, we calculate the output and associated error bound — depends only on N (typically small) and the parametric complexity of the problem. The method is thus ideally suited to the many-query and real-time contexts. In this paper, building on this technique, we develop a robust computational method for very fast solution of inverse problems characterized by parametrized partial differential equations. The essential ideas are three-fold: first, we apply the technique to the forward problem for the rapid certified evaluation of PDE input-output relations and associated rigorous error bounds; second, we incorporate the reduced-basis approximation and error bounds into the inverse problem formulation; and third, rather than regularize the goodness-of-fit objective, we may instead identify all (or almost all, in the probabilistic sense) system configurations consistent with the available experimental data — well-posedness is reflected in a bounded "possibility region" that furthermore shrinks as the experimental error is decreased.
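As a rough illustration of the offline/online split described above, the sketch below assumes an affine decomposition A(mu) = sum_q theta_q(mu) * A_q with a parameter-independent load vector and a compliant output; the function names, basis construction and dimensions are illustrative placeholders, not the paper's implementation.

```python
import numpy as np

# Offline stage (expensive, done once): project each affine block A_q and the load
# vector onto the snapshot space W_N spanned by solutions at selected parameter points.
def offline(A_q_full, f_full, snapshots):
    W, _ = np.linalg.qr(np.column_stack(snapshots))   # orthonormal basis of W_N
    A_q_red = [W.T @ A @ W for A in A_q_full]         # Q small N x N matrices
    f_red = W.T @ f_full                              # small N-vector
    return A_q_red, f_red

# Online stage (cheap, repeated for every new parameter value): assemble the reduced
# operator from the precomputed blocks, solve the N x N system, evaluate the output.
def online(A_q_red, f_red, thetas, mu):
    A_N = sum(theta(mu) * A for theta, A in zip(thetas, A_q_red))
    u_N = np.linalg.solve(A_N, f_red)
    return f_red @ u_N                                # compliant output s_N(mu) = f^T u_N
```

The online stage touches only the small reduced quantities, which is why its cost depends on N and the number of affine terms rather than on the dimension of the underlying discretization.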
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which some components are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second determining how the available unit is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the estimability of parameters, the nature of the computational process for estimating both the incidence and compositional parameters given the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential-zero compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
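As a rough sketch of the two-stage construction, the snippet below simulates from an independent-incidence variant: a Bernoulli (binomial) stage decides which parts are essential zeros, and an additive logistic-normal stage distributes the unit over the non-zero parts. The incidence probabilities, mean and covariance arguments are hypothetical inputs; no estimation of the paper's models is attempted.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_composition(p_incidence, mu, Sigma):
    """Draw one D-part composition: stage 1 picks the non-zero parts, stage 2 fills them."""
    D = len(p_incidence)
    z = rng.random(D) < p_incidence                    # stage 1: Bernoulli incidence vector
    x = np.zeros(D)
    idx = np.flatnonzero(z)
    if idx.size == 0:                                  # nothing present: all essential zeros
        return z.astype(int), x
    if idx.size == 1:                                  # a single part carries the whole unit
        x[idx] = 1.0
        return z.astype(int), x
    k = idx.size - 1
    y = rng.multivariate_normal(mu[:k], Sigma[:k, :k]) # stage 2: normal draw of log-ratios
    e = np.exp(np.append(y, 0.0))
    x[idx] = e / e.sum()                               # back-transform to the non-zero subcomposition
    return z.astype(int), x

# Hypothetical 4-part example: incidence probabilities, log-ratio mean and covariance.
print(simulate_composition(np.array([0.9, 0.8, 0.5, 0.3]), np.zeros(3), np.eye(3)))
```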
Abstract:
We consider the optimization problem of safety stock placement in a supply chain, as formulated in [1]. We prove that this problem is NP-Hard for supply chains modeled as general acyclic networks. Thus, we do not expect to find a polynomial-time algorithm for safety stock placement for a general-network supply chain.
Abstract:
The system described herein represents the first example of a recommender system in digital ecosystems where agents negotiate services on behalf of small companies. The small companies compete not only on price or quality, but also on a wider service-by-service composition achieved by subcontracting with other companies. The final result of these offerings depends on negotiations at the scale of millions of small companies. This scale requires new platforms for supporting digital business ecosystems, as well as related services such as open-id, trust management, monitors and recommenders. This is done in the Open Negotiation Environment (ONE), an open-source platform that allows agents, on behalf of small companies, to negotiate and use the ecosystem services, and enables the development of new agent technologies. The methods and tools of cyber engineering are necessary to build Open Negotiation Environments that are stable, a basic condition for predictable and reliable business environments. Aiming to build stable digital business ecosystems by means of improved collective intelligence, we introduce a model of negotiation style dynamics from the point of view of computational ecology. This model inspires an ecosystem monitor as well as a novel negotiation style recommender. The ecosystem monitor provides hints to the negotiation style recommender to achieve greater stability of an open negotiation environment in a digital business ecosystem. The greater stability provides the small companies with higher predictability, and therefore better business results. The negotiation style recommender is implemented with a simulated annealing algorithm at a constant temperature, and its impact is shown by applying it to a real case of an open negotiation environment populated by Italian companies.
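A minimal sketch of the kind of constant-temperature simulated annealing mentioned above; the candidate styles, the scoring function (standing in for the hints coming from the ecosystem monitor) and the temperature are illustrative assumptions, not the recommender implemented in ONE.

```python
import math
import random

def recommend_style(styles, score, temperature=1.0, steps=1000, seed=0):
    """Search for a high-scoring negotiation style with a Metropolis rule at fixed temperature."""
    rng = random.Random(seed)
    current = rng.choice(styles)
    best = current
    for _ in range(steps):
        candidate = rng.choice(styles)                 # propose an alternative style
        delta = score(candidate) - score(current)
        # accept improvements always, worse styles with a fixed-temperature probability
        if delta >= 0 or rng.random() < math.exp(delta / temperature):
            current = candidate
            if score(current) > score(best):
                best = current
    return best

# Hypothetical usage: three styles scored by a stand-in lookup table.
print(recommend_style(["cooperative", "competitive", "mixed"],
                      {"cooperative": 0.7, "competitive": 0.4, "mixed": 0.9}.get))
```

Because the temperature never decreases, the search always retains some probability of accepting a worse style, so it can keep adapting as the scores it receives change.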
Abstract:
The Networks and Complexity in Social Systems course commences with an overview of the nascent field of complex networks, dividing it into three related but distinct strands: statistical description of large-scale networks, viewed as static objects; the dynamic evolution of networks, where the structure of the network is understood in terms of a growth process; and dynamical processes that take place on fixed networks, that is, "networked dynamical systems". (A fourth area of potential research ties the previous three strands together under the rubric of co-evolution of networks and dynamics, but very little research has been done in this vein and so it is omitted.) The remainder of the course treats each of the three strands in greater detail, introducing technical knowledge as required, summarizing the research papers that have introduced the principal ideas, and pointing out directions for future development. With regard to networked dynamical systems, the course treats in detail the more specific topic of information propagation in networks, in part because this topic is of great relevance to social science, and in part because it has received the most attention in the literature to date.
Abstract:
High-level introduction for web science students, rather than for computer science students.
Abstract:
Some resources on agile methods and enterprise architecture frameworks.
Abstract:
In this session we look at how to think systematically about a problem and create a solution. We look at the definition and characteristics of an algorithm, and see how, through modularisation and decomposition, we can then choose a set of methods to create. We also compare this somewhat procedural approach with the way that design works in object-oriented systems.
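As a toy illustration of modularisation and decomposition (not taken from the session materials), the sketch below breaks a small marks-summary task into three single-purpose methods, each of which could be designed and tested separately.

```python
# Decompose "summarise the class marks" into read, compute and report steps.

def read_marks(raw: str) -> list[int]:
    return [int(m) for m in raw.split(",")]

def average(marks: list[int]) -> float:
    return sum(marks) / len(marks)

def report(marks: list[int]) -> str:
    return f"{len(marks)} marks, average {average(marks):.1f}"

print(report(read_marks("56,72,81,64")))
```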
Abstract:
We propose to integrate efforts from the social sciences into the development of management tools through the computational sciences. The aim is to develop methodological proposals that allow the improvement of a computational model capable of simulating the performance of a given brand, associated with a company, before its consumers. This monograph seeks to establish ways of achieving optimal information collection, a key input for an artificial-intelligence simulation model that will be applied to the behaviour of population groups, seeking to understand the responses that subjects show towards the organizational brand, based on the perception-reasoning-action principle.
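A minimal sketch of the perception-reasoning-action cycle referred to above, for a consumer agent reacting to brand stimuli; the attitude state, update rule and purchase threshold are illustrative assumptions rather than the model developed in the monograph.

```python
class ConsumerAgent:
    def __init__(self, attitude: float = 0.0):
        self.attitude = attitude                       # disposition towards the brand

    def perceive(self, stimulus: float) -> float:
        return max(-1.0, min(1.0, stimulus))           # clamp the brand signal to [-1, 1]

    def reason(self, percept: float) -> float:
        return 0.1 * (percept - self.attitude)         # simple attitude-adjustment rule

    def act(self, adjustment: float) -> str:
        self.attitude += adjustment
        return "buy" if self.attitude > 0.5 else "ignore"

agent = ConsumerAgent()
for signal in [0.8, 0.9, 0.7, 1.0]:                    # a hypothetical stream of brand stimuli
    print(agent.act(agent.reason(agent.perceive(signal))))
```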
Abstract:
The monograph presents sociopolitical self-organization as the best way of achieving organized patterns in human social systems, given their complex nature and the impossibility of the computational tasks required of classical political regimes, which operate through hierarchical control, a form of control that has proven not to be optimal for producing order in human social systems. The monograph extrapolates the theory of self-organization in biological systems to human sociopolitical dynamics, seeking optimal ways of organizing them, and argues that anarchic complex networks are the emergent structure of sociopolitical self-organization.
The current state of agent-based models and their impact on organizational research
Abstract:
In a hyperconnected, dynamic world loaded with uncertainty such as today's, conventional analytical methods and models are showing their limitations. Organizations therefore require useful tools that employ information technology and computational simulation models as mechanisms for decision making and problem solving. One of the most recent, powerful and promising is agent-based modeling and simulation (MSBA, from its Spanish initials). Many organizations, including consulting firms, use this technique to understand phenomena, evaluate strategies and solve problems of various kinds. Despite this, there is (as far as we know) no situational review of MSBA and its application to organizational research. It is also worth noting that, because of its novelty, the topic has not been sufficiently disseminated and worked on in Latin America. Consequently, this project aims to produce a situational review of MSBA and its impact on organizational research.
Abstract:
For the manager, the decision-making process is one of the greatest challenges and responsibilities, since it involves choosing the most appropriate path among countless alternatives, taking into account the social, political and economic obstacles of the business environment. To reach the right decision, the proposed goals and objectives must not be lost sight of, and the logical process must be kept in view, detecting, analysing and demonstrating the reasons for that choice. Consequently, the analysis proposed by this research will provide the manager with knowledge about the types of logic used in strategic decision making to meet the demands associated with marketing, so that the manager's relevant competencies for international insertion into an ever-larger labour market can be efficiently generated and expanded (Valero, 2011). Throughout the research, a theoretical study is developed to explain the relationship between logic and strategic marketing decisions and how these concepts combine to reach a final result. This is carried out through an analysis of marketing plans, starting from basic concepts such as marketing, logic, strategic decisions and marketing management, followed by the logical principles and contradictions that can arise within the theoretical foundations.