977 results for Information complexity
Abstract:
An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here we use the Urban Zones to characterize Energy partitioning (UZE) as the basis to assign default SLUCM parameter values. A methodology (FRAISE) to assign sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of the model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in the surface characteristics and intra-urban heterogeneities in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling application using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) the parameter values identified should be adopted as default values in WRF.
Abstract:
A causal explanation provides information about the causal history of whatever is being explained. However, most causal histories extend back almost infinitely and can be described in almost infinite detail. Causal explanations therefore involve choices about which elements of causal histories to pick out. These choices are pragmatic: they reflect our explanatory interests. When adjudicating between competing causal explanations, we must therefore consider not only questions of epistemic adequacy (whether we have good grounds for identifying certain factors as causes) but also questions of pragmatic adequacy (whether the aspects of the causal history picked out are salient to our explanatory interests). Recognizing that causal explanations differ pragmatically as well as epistemically is crucial for identifying what is at stake in competing explanations of the relative peacefulness of the nineteenth-century Concert system. It is also crucial for understanding how explanations of past events can inform policy prescription.
Abstract:
In the past few years, libraries have started to design public programs that educate patrons about different tools and techniques to protect personal privacy. But do end user solutions provide adequate safeguards against surveillance by corporate and government actors? What does a comprehensive plan for privacy entail in order that libraries live up to their privacy values? In this paper, the authors discuss the complexity of surveillance architecture that the library institution might confront when seeking to defend the privacy rights of patrons. This architecture consists of three main parts: physical or material aspects, logical characteristics, and social factors of information and communication flows in the library setting. For each category, the authors will present short case studies that are culled from practitioner experience, research, and public discourse. The case studies probe the challenges faced by the library—not only when making hardware and software choices, but also choices related to staffing and program design. The paper shows that privacy choices intersect not only with free speech and chilling effects, but also with questions that concern intellectual property, organizational development, civic engagement, technological innovation, public infrastructure, and more. The paper ends with discussion of what libraries will require in order to sustain and improve efforts to serve as stewards of privacy in the 21st century.
Abstract:
The intent of this paper is to provide a practitioner's insight into present and foreseeable problems of transaction cost economics related to culture and business etiquette that may increase the complexity of business communication. We also explore whether these factors affect participants' mindsets regarding opportunistic or passive-aggressive behavior. We study the role of culture, ethics, information asymmetry, and legal systems with regard to their importance for business contracts and for the lack of knowledge of local environments. We make connections to contract theory strategies and objectives and recommend business practices. Furthermore, economic theory explores the role of the impossibility of the perfect contract. Historical and present-day operational factors are examined to determine forward-looking contract law indications worldwide. This paper is intended to provide a practitioner's view with a global perspective on multinational, mid-sized, and small corporations, taking a non-partisan and non-nationalistic stance while examining the individual characteristics of the operational necessities and obligations of any corporation. The study is general, yet cites specific articles for each argument and gives adequate consideration to the intricacies of the global asymmetry of information. This paper argues that corporations of any kind and size should be aware of the risk that international business etiquette and cultural barriers might jeopardize the savings that could be obtained from engaging international suppliers.
Abstract:
Developing software is still a risky business. After 60 years of experience, the community is still not able to consistently build Information Systems (IS) for organizations with predictable quality, within previously agreed budget and time constraints. Although software is changeable, we are still unable to cope with the amount and complexity of change that organizations demand for their IS. To improve results, developers have followed two alternatives: frameworks, which increase productivity but constrain the flexibility of possible solutions; and agile ways of developing software, which keep flexibility with fewer upfront commitments. With strict frameworks, specific hacks have to be put in place to get around the framework's construction options. In time this leads to inconsistent architectures that are harder to maintain due to incomplete documentation and human-resources turnover. The main goal of this work is to create a new way to develop flexible IS for organizations, using web technologies, in a faster, better and cheaper way that is more suited to handle organizational change. To do so we propose an adaptive object model that uses a new ontology for data and action with strict normalizing rules. These rules should bound the effects of changes, which can then be better tested and therefore corrected. Interfaces are built with templates of resources that can be reused and extended in a flexible way. The “state of the world” for each IS is determined by all production and coordination acts that agents have performed over time, even those performed by external systems. When bugs are found during maintenance, their past cascading effects can be checked through simulation, re-running the log of transaction acts over time and checking results against previous records. This work implements a prototype with part of the proposed system in order to make a preliminary assessment of its feasibility and limitations.
Abstract:
Managing the great complexity of an enterprise system, given the number of entities and the variety of decisions and processes to be controlled, is a very hard task, because it involves integrating the enterprise's operations and its information systems. Moreover, enterprises find themselves in a constant process of change, reacting to a dynamic and competitive environment in which their business processes are constantly altered. Transforming business processes into models allows them to be analyzed and redefined. Through the use of computing tools it is possible to minimize the cost and risks of an enterprise integration design. This article argues for the necessity of modeling processes in order to define enterprise business requirements more precisely, and for the adequate usage of modeling methodologies. Following these patterns, the paper addresses process modeling in the domain of demand forecasting as a practical example. The demand forecasting domain was built based on a theoretical review. The resulting models, taken as a reference model, are transformed into information systems, with the aim of introducing a generic solution and serving as a starting point for better forecasting practice. The proposal is to promote the adequacy of the information system to the real needs of an enterprise, enabling it to obtain and track better results while minimizing design errors, time, money and effort. The enterprise process models are obtained using the CIMOSA language, and the supporting information system is modeled with the UML language.
Abstract:
We investigate, from a philosophical perspective, the relation between abductive reasoning and information in the context of biological systems. Emphasis is given to the organizational role played by abductive reasoning in practical activities of embodied embedded agency that involve meaningful information. From this perspective, meaningful information is provisionally characterized as a self-organizing process of pattern generation that constrains coherent action. We argue that this process can be considered as a part of evolutionarily developed learning abilities of organisms that help with their survival. We investigate the case of inorganic mechanical systems (like robots), which deal only with stable forms of habits, rather than with evolving learning abilities. Some difficulties are considered concerning the hypothesis that mechanical systems may operate with meaningful information, present in abductive reasoning. Finally, an example of hypothesis creation in the domain of medical sciences is presented in order to illustrate the complexity of abduction in practical reasoning concerning human activities. © 2007 Springer-Verlag Berlin Heidelberg.
Abstract:
The results of the histopathological analyses after the implantation of highly crystalline PVA microspheres in subcutaneous tissues of Wistar rats are reported herein. Three different groups of PVA microparticles were systematically studied: highly crystalline, amorphous, and commercial ones. In addition to these experiments, complementary analyses of architectural complexity were performed using fractal dimension (FD) and Shannon's entropy (SE) concepts. The highly crystalline microspheres induced inflammatory reactions similar to those observed for the commercial ones, while the inflammatory reactions caused by the amorphous ones were less intense. Statistical analyses of the subcutaneous tissues of Wistar rats implanted with the highly crystalline microspheres resulted in FD and SE values significantly higher than the statistical parameters observed for the amorphous ones. The FD and SE parameters obtained for the subcutaneous tissues of Wistar rats implanted with crystalline and commercial microparticles were statistically similar. Briefly, the results indicated that the new highly crystalline microspheres had biocompatible behavior comparable to the commercial ones. In addition, statistical tools such as FD and SE analyses, when combined with histopathological analyses, can be useful for investigating the architectural complexity of tissues caused by complex inflammatory reactions. © 2012 WILEY PERIODICALS, INC.
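The abstract names fractal dimension (FD) and Shannon's entropy (SE) as the complexity measures but does not describe how they were computed. As a hedged illustration only (not the authors' code, and assuming the common box-counting and gray-level-histogram formulations), both can be sketched for a 2-D image as:

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary image by box counting."""
    counts = []
    h, w = img.shape
    for s in sizes:
        # count the s x s boxes containing at least one foreground pixel
        n = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                if img[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    # the slope of log(count) vs log(1/size) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def shannon_entropy(img):
    """Shannon entropy of the gray-level histogram, in bits."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return -np.sum(p * np.log2(p))
```

A filled binary square yields a dimension near 2, and an image using all 256 gray levels equally yields 8 bits of entropy, which are useful sanity checks for both estimators.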
Abstract:
We used the statistical measurements of information entropy, disequilibrium and complexity to infer a hierarchy of equations of state for two types of compact stars from the broad class of neutron stars, namely, with hadronic composition and with strange quark composition. Our results show that, since order costs energy, Nature would favor the exotic strange stars, even though the question of how the strange stars form cannot be answered within this approach. (C) 2012 Elsevier B.V. All rights reserved.
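The entropy-disequilibrium-complexity triplet mentioned above is commonly formalized as the LMC (Lopez-Ruiz-Mancini-Calbet) measure C = H * D. Assuming that formulation (the abstract does not specify which complexity measure was used), a minimal sketch for a discrete probability distribution is:

```python
import numpy as np

def lmc_complexity(p):
    """LMC statistical complexity C = H * D for a discrete distribution p.

    H is the Shannon entropy normalized to [0, 1]; D is the disequilibrium,
    the squared distance of p from the equiprobable distribution.
    """
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                            # normalize to a distribution
    n = len(p)
    nz = p[p > 0]
    h = -np.sum(nz * np.log(nz)) / np.log(n)   # normalized Shannon entropy
    d = np.sum((p - 1.0 / n) ** 2)             # disequilibrium
    return h * d, h, d
```

C vanishes at both extremes, perfect order (a delta distribution, H = 0) and perfect disorder (the uniform distribution, D = 0), so it peaks for intermediate structure, which is what makes it useful for ranking equations of state.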
Abstract:
This paper presents a new parallel methodology for calculating the determinant of matrices of order n, with computational complexity O(n), using the Gauss-Jordan Elimination Method and Chio's Rule as references. We intend to present our methodology step by step, using clear mathematical language, and to demonstrate how to calculate the determinant of a matrix of order n in an analytical format. We also present a computational model with one sequential algorithm and one parallel algorithm in pseudo-code.
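The paper's parallel O(n) scheme cannot be reconstructed from the abstract alone. For contrast, the standard sequential baseline it improves on, determinant computation by Gaussian elimination with partial pivoting at O(n^3), can be sketched as:

```python
import numpy as np

def determinant(a):
    """Determinant via Gaussian elimination with partial pivoting, O(n^3)."""
    a = np.array(a, dtype=float)
    n = a.shape[0]
    det = 1.0
    for k in range(n):
        # pivot on the largest entry in column k for numerical stability
        p = k + np.argmax(np.abs(a[k:, k]))
        if a[p, k] == 0.0:
            return 0.0          # singular matrix
        if p != k:
            a[[k, p]] = a[[p, k]]
            det = -det          # each row swap flips the sign
        det *= a[k, k]          # determinant is the product of the pivots
        # eliminate the entries below the pivot
        a[k + 1:, k:] -= np.outer(a[k + 1:, k] / a[k, k], a[k, k:])
    return det
```

The determinant accumulates as the product of the pivots, with a sign flip per row swap; this is the O(n^3) yardstick against which any parallel O(n)-depth method would be compared.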
Abstract:
Introduction. Postnatal neurogenesis in the hippocampal dentate gyrus can be modulated by numerous determinants, such as hormones, transmitters and stress. Among the factors positively interfering with neurogenesis, the complexity of the environment appears to play a particularly striking role. Adult mice reared in an enriched environment produce more neurons and exhibit better performance in hippocampus-specific learning tasks. While the effects of complex environments on hippocampal neurogenesis are well documented, there is a lack of information on the effects of living under socio-sensory deprivation conditions. Due to the immaturity of rats and mice at birth, studies dealing with the effects of environmental enrichment on hippocampal neurogenesis were carried out in adult animals, i.e. during a period of relatively low rate of neurogenesis. The impact of environment is likely to be more dramatic during the first postnatal weeks, because at this time granule cell production is remarkably higher than at later phases of development. The aim of the present research was to clarify whether and to what extent isolated or enriched rearing conditions affect hippocampal neurogenesis during the early postnatal period, a time window characterized by a high rate of precursor proliferation, and to elucidate the mechanisms underlying these effects. The experimental model chosen for this research was the guinea pig, a precocious rodent which, at 4-5 days of age, can be independent of maternal care. Experimental design. Animals were assigned to a standard (control), an isolated, or an enriched environment a few days after birth (P5-P6). On P14-P17 animals received one daily bromodeoxyuridine (BrdU) injection, to label dividing cells, and were sacrificed either on P18, to evaluate cell proliferation, or on P45, to evaluate cell survival and differentiation. Methods. Brain sections were processed for BrdU immunohistochemistry, to quantify the newborn and surviving cells.
The phenotype of the surviving cells was examined by means of confocal microscopy and immunofluorescent double-labeling for BrdU and either a marker of neurons (NeuN) or a marker of astrocytes (GFAP). Apoptotic cell death was examined with the TUNEL method. Serial sections were processed for immunohistochemistry for i) vimentin, a marker of radial glial cells, ii) BDNF (brain-derived neurotrophic factor), a neurotrophin involved in neuron proliferation/survival, iii) PSA-NCAM (the polysialylated form of the neural cell adhesion molecule), a molecule associated with neuronal migration. Total granule cell number in the dentate gyrus was evaluated by stereological methods, in Nissl-stained sections. Results. Effects of isolation. In P18 isolated animals we found a reduced cell proliferation (-35%) compared to controls and a lower expression of BDNF. Though in absolute terms P45 isolated animals had fewer surviving cells than controls, they showed no differences in survival rate or phenotype percent distribution compared to controls. Evaluation of the absolute number of surviving cells of each phenotype showed that isolated animals had fewer cells with a neuronal phenotype than controls. Looking at the location of the new neurons, we found that while in control animals 76% of them had migrated to the granule cell layer, in isolated animals only 55% of the new neurons had reached this layer. Examination of radial glia cells of P18 and P45 animals by vimentin immunohistochemistry showed that in isolated animals radial glia cells were reduced in density and had fewer and shorter processes. Granule cell count revealed that isolated animals had fewer granule cells than controls (-32% at P18 and -42% at P45). Effects of enrichment. In P18 enriched animals there was an increase in cell proliferation (+26%) compared to controls and a higher expression of BDNF.
Though in both groups there was a decline in the number of BrdU-positive cells by P45, enriched animals had more surviving cells (+63%) and a higher survival rate than controls. No differences were found between control and enriched animals in phenotype percent distribution. Evaluation of the absolute number of cells of each phenotype showed that enriched animals had a larger number of cells of each phenotype than controls. Looking at the location of cells of each phenotype, we found that enriched animals had more new neurons in the granule cell layer and more astrocytes and cells with undetermined phenotype in the hilus. Enriched animals had a higher expression of PSA-NCAM in the granule cell layer and hilus. Vimentin immunohistochemistry showed that in enriched animals radial glia cells were more numerous and had more processes. Granule cell count revealed that enriched animals had more granule cells than controls (+37% at P18 and +31% at P45). Discussion. Results show that isolation rearing reduces hippocampal cell proliferation but does not affect cell survival, while enriched rearing increases both cell proliferation and cell survival. Changes in the expression of BDNF are likely to contribute to the effects of environment on precursor cell proliferation. The reduction and increase in the final number of granule neurons in isolated and enriched animals, respectively, are attributable to the effects of environment on cell proliferation and survival and not to changes in the differentiation program. As radial glia cells play a pivotal role in neuron guidance to the granule cell layer, the reduced number of radial glia cells in isolated animals and the increased number in enriched animals suggest that the size of the radial glia population may change dynamically, in order to match changes in neuron production. The high PSA-NCAM expression in enriched animals may concur to favor the survival of the new neurons by facilitating their migration to the granule cell layer.
Conclusions. By using a precocious rodent we could demonstrate that isolated/enriched rearing conditions, at a time window during which intense granule cell proliferation takes place, lead to a notable decrease/increase of total granule cell number. The time-course and magnitude of postnatal granule cell production in guinea pigs are more similar to the human and non-human primate condition than to those of rats and mice. Translation of the current data to humans would imply that exposure of children to environments poor/rich in stimuli may have a notably large impact on dentate neurogenesis and, very likely, on hippocampus-dependent memory functions.
Abstract:
This thesis presents the outcomes of a Ph.D. course in telecommunications engineering. It focuses on the optimization of the physical layer of digital communication systems and provides innovations for both multi- and single-carrier systems. For the former type we first addressed the problem of capacity in the presence of several nuisances. Moreover, we extended the concept of the Single Frequency Network to the satellite scenario, and then introduced a novel concept in subcarrier data mapping, resulting in a very low PAPR of the OFDM signal. For single-carrier systems we proposed a method to optimize constellation design in the presence of strong distortion, such as the non-linear distortion introduced by a satellite's on-board high-power amplifier; we then developed a method to calculate the bit/symbol error rate for a given constellation, achieving improved accuracy with respect to the traditional Union Bound at no additional complexity. Finally, we designed a low-complexity SNR estimator that saves half of the multiplications with respect to the ML estimator while achieving similar estimation accuracy.
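To make concrete the PAPR metric that the subcarrier-mapping contribution targets, a minimal sketch (assuming QPSK mapping and 64 subcarriers for illustration; this is not the thesis's novel mapping scheme) computes the peak-to-average power ratio of a plain OFDM symbol:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(0)
n_sc = 64  # illustrative number of subcarriers (assumption)
# random QPSK symbols on each subcarrier, unit average power
qpsk = (rng.choice([-1, 1], n_sc) + 1j * rng.choice([-1, 1], n_sc)) / np.sqrt(2)
# IFFT maps frequency-domain symbols to the time-domain OFDM waveform
ofdm_symbol = np.fft.ifft(qpsk) * np.sqrt(n_sc)
print(f"PAPR = {papr_db(ofdm_symbol):.2f} dB")
```

Because the IFFT sums many independently modulated subcarriers, occasional coherent peaks give conventional OFDM a PAPR of several dB, which is exactly what low-PAPR mapping schemes aim to reduce before the signal reaches a non-linear amplifier.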