993 results for Software Complexity
Abstract:
Ventilation is a fundamental process that influences the indoor climate of a greenhouse. Ventilation contributes to controlling the temperature, humidity, and gas concentration (such as CO2) of the indoor air and, consequently, influences crop growth and development. Despite its importance, its calculation is somewhat complex. This spreadsheet is presented with the aim of providing an approximation of this value. By entering the data for your multi-tunnel greenhouse and the wind conditions (direction and speed), a value for these air exchanges is obtained. Instructions for calculating the ventilation rate of a multi-tunnel greenhouse: 1. Enter the greenhouse dimensions (cells in red). 2. Enter the window characteristics. 3. Enter the wind speed. 4. Enter the orientation of the windows relative to the wind.
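As an illustration of the kind of computation such a spreadsheet performs, the sketch below assumes the commonly used wind-driven ventilation model for a single roof opening, flow = (A/2) * Cd * sqrt(Cw) * u, with assumed typical values for the discharge coefficient Cd and wind-pressure coefficient Cw. The spreadsheet's actual formulas, which also account for window type and orientation, may differ.

```python
import math

def air_exchange_rate(vent_area_m2, greenhouse_volume_m3, wind_speed_ms,
                      cd=0.66, cw=0.1):
    """Approximate wind-driven air exchange rate (renewals per hour).

    Uses the common single-opening model: flow = (A/2) * Cd * sqrt(Cw) * u.
    Cd and Cw are assumed typical values, not taken from the spreadsheet
    described above.
    """
    flow_m3_s = (vent_area_m2 / 2.0) * cd * math.sqrt(cw) * wind_speed_ms
    return 3600.0 * flow_m3_s / greenhouse_volume_m3

# Example: 40 m2 of roof vents, a 1500 m3 greenhouse, 3 m/s wind
rate = air_exchange_rate(40, 1500, 3.0)
```

With these assumed coefficients the example yields on the order of a few tens of air renewals per hour, the magnitude such tools typically report.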
Abstract:
The Whitehead minimization problem consists in finding a minimum size element in the automorphic orbit of a word, a cyclic word or a finitely generated subgroup in a finite rank free group. We give the first fully polynomial algorithm to solve this problem, that is, an algorithm that is polynomial both in the length of the input word and in the rank of the free group. Earlier algorithms had an exponential dependency in the rank of the free group. It follows that the primitivity problem – to decide whether a word is an element of some basis of the free group – and the free factor problem can also be solved in polynomial time.
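The orbit search underlying Whitehead minimization can be illustrated with a toy greedy reduction in the rank-2 free group. This is a hypothetical sketch using a small set of elementary Nielsen automorphisms, not the paper's polynomial-time algorithm (which relies on Whitehead automorphisms and peak reduction); greedy descent over this set is only a heuristic, with no minimality guarantee.

```python
def inv(c):
    """Inverse of a generator: case swap, so 'a' <-> 'A'."""
    return c.swapcase()

def reduce_word(w):
    """Freely reduce: cancel adjacent inverse pairs like 'aA' or 'Bb'."""
    out = []
    for c in w:
        if out and out[-1] == inv(c):
            out.pop()
        else:
            out.append(c)
    return ''.join(out)

def apply_auto(phi, w):
    """Apply an automorphism, given as a dict on the generators 'a', 'b'."""
    img = []
    for c in w:
        if c.islower():
            img.append(phi[c])
        else:  # image of an inverse letter: reverse and invert the image
            img.append(''.join(inv(x) for x in reversed(phi[c.lower()])))
    return reduce_word(''.join(img))

# Elementary Nielsen automorphisms of F_2 = <a, b> (an illustrative set;
# the paper's algorithm searches over Whitehead automorphisms instead).
AUTOS = [
    {'a': 'ab', 'b': 'b'}, {'a': 'aB', 'b': 'b'},
    {'a': 'ba', 'b': 'b'}, {'a': 'Ba', 'b': 'b'},
    {'a': 'a', 'b': 'ba'}, {'a': 'a', 'b': 'bA'},
    {'a': 'a', 'b': 'ab'}, {'a': 'a', 'b': 'Ab'},
    {'a': 'A', 'b': 'b'}, {'a': 'b', 'b': 'a'},
]

def greedy_minimize(w):
    """Greedily shorten w within its Aut(F_2)-orbit; a heuristic sketch."""
    w = reduce_word(w)
    improved = True
    while improved:
        improved = False
        for phi in AUTOS:
            v = apply_auto(phi, w)
            if len(v) < len(w):
                w, improved = v, True
                break
    return w
```

For example, the primitive word 'aba' reduces to a single generator, while the commutator 'abAB' stays at length 4, since automorphic images of a commutator are conjugates of a commutator.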
Abstract:
This project covers the analysis, design, and implementation of a migration protocol for software agents based on sending the agents' code fragmented across multiple messages. This protocol fits within a multi-protocol migration architecture for agent mobility between JADE platforms. Finally, a study was carried out comparing the performance achieved by the protocol and the features it provides.
Abstract:
The design complexity of mobile agents grows as their functionality increases. This project proposes approaching the problem from a modular point of view. A study was carried out of both the agents themselves and the parts that make them up. Likewise, the mechanisms needed to enable secure communication between agents were established and implemented. Finally, two components were developed that provide mobile-agent tracking and retrieval of the generated results. Component-based agent development applies the old "divide and conquer" strategy to the design phase, thereby reducing its considerable complexity.
Abstract:
Neuroblastoma (NB) is a neural crest-derived childhood tumor characterized by a remarkable phenotypic diversity, ranging from spontaneous regression to fatal metastatic disease. Although the cancer stem cell (CSC) model provides a trail to characterize the cells responsible for tumor onset, the NB tumor-initiating cell (TIC) has not been identified. In this study, the relevance of the CSC model in NB was investigated by taking advantage of typical functional stem cell characteristics. A predictive association was established between self-renewal, as assessed by serial sphere formation, and clinical aggressiveness in primary tumors. Moreover, cell subsets gradually selected during serial sphere culture harbored increased in vivo tumorigenicity, only highlighted in an orthotopic microenvironment. A microarray time course analysis of serial spheres passages from metastatic cells allowed us to specifically "profile" the NB stem cell-like phenotype and to identify CD133, ABC transporter, and WNT and NOTCH genes as spheres markers. On the basis of combined sphere markers expression, at least two distinct tumorigenic cell subpopulations were identified, also shown to preexist in primary NB. However, sphere markers-mediated cell sorting of parental tumor failed to recapitulate the TIC phenotype in the orthotopic model, highlighting the complexity of the CSC model. Our data support the NB stem-like cells as a dynamic and heterogeneous cell population strongly dependent on microenvironmental signals and add novel candidate genes as potential therapeutic targets in the control of high-risk NB.
Abstract:
Since 1999 the Consorcio de Bibliotecas Universitarias de Cataluña (CBUC), together with the Centro de Supercomputación de Cataluña (CESCA), has developed a new line of work to promote the research carried out in Catalonia and, at the same time, to contribute to the worldwide movement of depositing academic and research output openly on the net. This worldwide movement, known as Open Access, was launched to create alternatives to the paradigm of paying for access to information that has very often been produced with public funding and resources. This new line of work is institutional repositories. In this paper we briefly present the current state of the cooperative repositories implemented, their content (standards used, copyright, preservation, etc.) and their container (software and technology used, protocols, etc.).
Abstract:
Report on the scientific sojourn at the German Aerospace Center (DLR), Germany, during June and July 2006. The main objective of the two-month stay was to apply the techniques of LEO (Low Earth Orbiter) satellite GPS navigation which DLR currently uses in real-time navigation. These techniques comprise the use of a dynamical model that takes into account the precise Earth gravity field, together with models for the effects that perturb a LEO's motion: drag forces due to the Earth's atmosphere, solar radiation pressure impacting the spacecraft, luni-solar gravity due to the perturbation of the gravity field by the attraction of the Sun and Moon, and tidal forces due to ocean and solid tides. A highly parameterized software package was produced in the first part of the work and used to assess what accuracy could be reached by exploring different models and complexities. The objective was to study accuracy vs. complexity, taking into account that LEOs at different heights behave differently. In this frame, several LEOs were selected over a wide range of altitudes, and several approaches of varying complexity were chosen. Complexity is a very important issue because processors onboard spacecraft have very limited computing and memory resources, so the algorithms must be kept simple enough for the satellite to process them by itself.
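The accuracy-vs-complexity trade-off described above can be illustrated with a minimal force-model sketch: point-mass gravity plus an optional J2 oblateness term, typically the first refinement added to a LEO dynamical model. This is an illustrative example, not DLR's navigation software; each further perturbation (drag, solar pressure, luni-solar gravity, tides) would add terms, and cost, to this function.

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
RE = 6378137.0        # Earth equatorial radius, m
J2 = 1.08262668e-3    # Earth oblateness coefficient

def accel(r, use_j2=True):
    """Gravitational acceleration at position r = (x, y, z) in meters.

    Point-mass gravity, optionally with the J2 oblateness term. Toggling
    use_j2 illustrates trading model fidelity against onboard compute cost.
    """
    x, y, z = r
    rn = math.sqrt(x*x + y*y + z*z)
    k = -MU / rn**3
    ax, ay, az = k*x, k*y, k*z
    if use_j2:
        f = 1.5 * J2 * MU * RE**2 / rn**5
        zz = 5.0 * z*z / (rn * rn)
        ax += f * x * (zz - 1.0)
        ay += f * y * (zz - 1.0)
        az += f * z * (zz - 3.0)
    return (ax, ay, az)

# At 500 km altitude the J2 term changes the acceleration by roughly one
# part in a thousand, yet it dominates the long-term orbit evolution.
a_simple = accel((RE + 500e3, 0.0, 0.0), use_j2=False)
a_full = accel((RE + 500e3, 0.0, 0.0), use_j2=True)
```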
Abstract:
This paper reports on: (a) new primary source evidence on; and (b) statistical and econometric analysis of high technology clusters in Scotland. It focuses on the following sectors: software, life sciences, microelectronics, optoelectronics, and digital media. Evidence from a postal and e-mailed questionnaire is presented and discussed under the headings of: performance, resources, collaboration & cooperation, embeddedness, and innovation. The sampled firms are characterised as being small (viz. micro-firms and SMEs), knowledge intensive (largely graduate staff), research intensive (mean R&D spend of GBP 842k), and internationalised (mainly selling to markets beyond Europe). Preliminary statistical evidence is presented on Gibrat's Law (independence of growth and size) and the Schumpeterian Hypothesis (scale economies in R&D). Estimates suggest a short-run equilibrium size of just 100 employees, but a long-run equilibrium size of 1,000 employees. Further, to achieve the Schumpeterian effect (of marked scale economies in R&D), estimates suggest that firms have to grow to very much larger sizes of beyond 3,000 employees. We argue that the principal way of achieving the latter scale may need to be by takeovers and mergers, rather than by internally driven growth.
Abstract:
This work has two parts. The first is a compilation of topics related to e-mail and its use: its history; its component elements; the services and programs it offers; the use of this tool; its importance within e-marketing; its effectiveness as a marketing tool; the attributes assigned to it; its main applications; the legislation that regulates it; and other information that can be very useful when preparing an e-mail campaign. The second part contains a quantitative study of some elements or variables that can influence the final effectiveness of mass e-mailings carried out by a company for commercial purposes.
Abstract:
As the complexity of mobile agents' tasks grows, it becomes increasingly important that they do not lose the work already done. We must know at all times that execution is proceeding favorably. This project describes the process of building a fault-tolerance component, from its initial idea through to its implementation. We analyze the situation and design a solution. Our component aims to mask an agent's failure by detecting it and then resuming execution from the point of interruption, all while following the design methodology for mobile agents on lightweight platforms.
Abstract:
I develop a model of endogenous bounded rationality due to search costs, arising implicitly from the problem's complexity. The decision maker is not required to know the entire structure of the problem when making choices but can think ahead, through costly search, to reveal more of it. However, the costs of search are not assumed exogenously; they are inferred from revealed preferences through her choices. Thus, bounded rationality and its extent emerge endogenously: as problems become simpler or as the benefits of deeper search become larger relative to its costs, the choices more closely resemble those of a rational agent. For a fixed decision problem, the costs of search will vary across agents. For a given decision maker, they will vary across problems. The model therefore explains why the disparity between observed choices and those prescribed under rationality varies across agents and problems. It also suggests, under reasonable assumptions, an identifying prediction: a relation between the benefits of deeper search and the depth of the search. As long as calibration of the search costs is possible, this can be tested on any agent-problem pair. My approach provides a common framework for depicting the underlying limitations that force departures from rationality in different and unrelated decision-making situations. Specifically, I show that it is consistent with violations of timing independence in temporal framing problems, dynamic inconsistency and diversification bias in sequential versus simultaneous choice problems, and with plausible but contrasting risk attitudes across small- and large-stakes gambles.
Abstract:
This paper critically examines a number of issues relating to the measurement of tax complexity. It starts with an analysis of the concept of tax complexity, distinguishing tax design complexity and operational complexity. It considers the consequences/costs of complexity, and then examines the rationale for measuring complexity. Finally it applies the analysis to an examination of an index of complexity developed by the UK Office of Tax Simplification (OTS).