811 results for Architecture and Complexity
Abstract:
Pythagoras, Plato and Euclid paved the way for classical geometry. The idea of shapes that can be mathematically defined by equations led to the creation of great structures of ancient and modern civilizations, and to milestones in mathematics and science. However, classical geometry fails to explain the complexity of non-linear shapes abundant in nature, such as the curvature of a flower or the wings of a butterfly. Such non-linearity can be described by fractal geometry, which generates shapes that emulate those found in nature with remarkable accuracy. This phenomenon raises the question of an architectural origin for biological existence within the universe. While a unifying equation of life has yet to be discovered, the Fibonacci sequence may establish an origin for such a development. The observation that the Fibonacci sequence appears in almost all aspects of life, from the leaves of a fern tree to architecture and even paintings, makes it highly unlikely to be a stochastic phenomenon. Despite its widespread occurrence, the Fibonacci series and the Rule of Golden Proportions have not been widely documented in the human body. This paper reviews the documented observations of the Fibonacci sequence in the human body.
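A minimal numerical sketch (not drawn from the paper itself) of the property behind the Rule of Golden Proportions: the ratio of consecutive Fibonacci numbers converges to the golden ratio φ = (1 + √5)/2 ≈ 1.618.

```python
# Illustrative sketch (not from the reviewed paper): consecutive Fibonacci
# ratios converge to the golden ratio phi = (1 + sqrt(5)) / 2.
from math import sqrt

def fibonacci(n):
    """Return the first n Fibonacci numbers: 1, 1, 2, 3, 5, ..."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

phi = (1 + sqrt(5)) / 2          # golden ratio, ~1.6180339887
fib = fibonacci(20)

for a, b in zip(fib, fib[1:]):
    ratio = b / a                # ratio of consecutive terms
    print(f"{b:>6}/{a:<6} = {ratio:.10f}   error vs. phi: {abs(ratio - phi):.2e}")
```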
Abstract:
Complexity science is the multidisciplinary study of complex systems. Its marked network orientation lends itself well to transport contexts. Key features of complexity science are introduced and defined, with a specific focus on the application to air traffic management. An overview of complex network theory is presented, with examples of its corresponding metrics and multiple scales. Complexity science is starting to make important contributions to performance assessment and system design: selected, applied air traffic management case studies are explored. The important contexts of uncertainty, resilience and emergent behaviour are discussed, with future research priorities summarised.
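As a hedged illustration of the network metrics referred to above, the sketch below computes degree and betweenness centrality for a small, invented airport network using the networkx library; the airports and routes are placeholders, not data from the cited case studies.

```python
# Minimal sketch, not taken from the cited case studies: two common
# complex-network metrics computed on a small, invented airport network.
import networkx as nx

G = nx.Graph()
# Hypothetical route network: edges are direct connections between airports.
G.add_edges_from([
    ("LHR", "CDG"), ("LHR", "FRA"), ("CDG", "FRA"),
    ("FRA", "MAD"), ("MAD", "LIS"), ("CDG", "LIS"),
])

degree = nx.degree_centrality(G)             # normalised node degree
betweenness = nx.betweenness_centrality(G)   # share of shortest paths passing through each node

for airport in G.nodes:
    print(f"{airport}: degree={degree[airport]:.2f}, betweenness={betweenness[airport]:.2f}")
```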
Abstract:
Complexity science and its methodological applications have increased in popularity in social science during the last two decades. One key concept within complexity science is that of self-organization. Self-organization refers to the emergence of stable patterns through autonomous and self-reinforcing dynamics at the micro-level. In spite of its potential relevance for the study of social dynamics, the articulation and use of the concept of self-organization have been kept within the boundaries of complexity science, and links to and from mainstream social science are scarce. These links can be difficult to establish, even for researchers working in social complexity with a background in social science, because of the theoretical and conceptual diversity and fragmentation in traditional social science. This article is meant to serve as a first step in the process of overcoming this lack of cross-fertilization between complexity and mainstream social science. A systematic review of the concept of self-organization and a critical discussion of similar notions in mainstream social science are presented, in an effort to help practitioners within subareas of complexity science to identify literature from traditional social science that could potentially inform their research.
Abstract:
Sports tourism has received growing attention in academic research over the past two decades (Weed and Bull, 2009; Gibson, 2005), but greater understanding of the consumer is needed, particularly of the factors influencing decisions to include sport as part of a leisure trip. This paper provides, through a focus on the sport of golf, insight into the characteristics of the sports tourist and how sports tourist behaviours influence the selection of locations deemed suitable for sports participation. This qualitative research employs a grounded theory methodology, underpinned by a constructivist epistemology, to evaluate twenty-six in-depth interviews with golf tourists. The findings propose a model which explains the relationship between golf tourist behaviours and destination selection. This identifies six strands which determine the relationship between the golf tourist, golf behaviours and destination selection (constructing the golf holiday, emotional rewards of taking a trip, total trip spend, amenities and support facilities, course characteristics, and reputation of the destination). Furthermore, it illuminates the complexity of these relationships through recognition of four spheres of influence (group dynamics, competition and ability, golfing capital, and intermediaries). The discussion elucidates how this increased understanding of golf tourist behaviours and destination selection might be applied to other sports, with conclusions exploring implications for the sports tourism industry and destinations.
Abstract:
In traditional Japanese culture it is very difficult to separate the landscape from the architecture. Japanese architectural culture has its roots in China, but it soon developed its own identity and an aesthetic that resulted from a long isolation from the rest of the world. Zen Buddhism and the constant relationship with nature define the main characteristics of Japanese architecture: minimalism and simplicity. The architecture is a perfect balance of harmony, proportion and purity. This paper aims to analyze the cultural roots of the relationship between architecture and landscape in Japan, where the characteristics defined above are essential to understanding the significance of Japanese architectural thinking.
Abstract:
Modern High-Performance Computing (HPC) systems are gradually increasing in size and complexity due to the corresponding demand for larger simulations requiring more complicated tasks and higher accuracy. However, as a side effect of Dennard scaling approaching its ultimate power limit, the efficiency of software also plays an important role in increasing the overall performance of a computation. Tools to measure application performance in these increasingly complex environments provide insights into the intricate ways in which software and hardware interact. Monitoring power consumption in order to save energy is possible through processor interfaces such as Intel's Running Average Power Limit (RAPL). Given the low level of these interfaces, they are often paired with an application-level tool like the Performance Application Programming Interface (PAPI). Since several problems in many heterogeneous fields can be represented as a complex linear system, an optimized and scalable linear system solver can significantly decrease the time needed to compute its solution. One of the most widely used algorithms for the solution of large systems is Gaussian elimination, whose most popular implementation for HPC systems is found in the Scalable Linear Algebra PACKage (ScaLAPACK) library. Another relevant algorithm, which is gaining popularity in the academic field, is the Inhibition Method. This thesis compares the energy consumption of the Inhibition Method and of Gaussian elimination from ScaLAPACK, profiling their execution during the solution of linear systems on the HPC architecture offered by CINECA. Moreover, it also collates the energy and power values for different rank, node, and socket configurations. The monitoring tools employed to track the energy consumption of these algorithms are PAPI and RAPL, which are integrated with the parallel execution of the algorithms managed with the Message Passing Interface (MPI).
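For reference, a serial sketch of Gaussian elimination with partial pivoting is shown below (in Python/NumPy for brevity); it only illustrates the underlying algorithm, not the parallel ScaLAPACK implementation or the PAPI/RAPL instrumentation benchmarked in the thesis.

```python
# Illustrative serial sketch of Gaussian elimination with partial pivoting;
# the thesis benchmarks the parallel ScaLAPACK implementation, not this code.
import numpy as np

def gaussian_elimination(A, b):
    """Solve A x = b by forward elimination with partial pivoting and back substitution."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = b.size

    # Forward elimination: reduce A to upper-triangular form.
    for k in range(n - 1):
        pivot = k + np.argmax(np.abs(A[k:, k]))   # partial pivoting for numerical stability
        A[[k, pivot]] = A[[pivot, k]]
        b[[k, pivot]] = b[[pivot, k]]
        for i in range(k + 1, n):
            factor = A[i, k] / A[k, k]
            A[i, k:] -= factor * A[k, k:]
            b[i] -= factor * b[k]

    # Back substitution on the resulting triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[4.0, -2.0, 1.0], [3.0, 6.0, -4.0], [2.0, 1.0, 8.0]])
b = np.array([1.0, 2.0, 3.0])
print(gaussian_elimination(A, b))   # should agree with np.linalg.solve(A, b)
```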
Abstract:
Valproic acid (VPA) and trichostatin A (TSA) are known histone deacetylase inhibitors (HDACIs) with epigenetic activity that affect chromatin supra-organization, nuclear architecture, and cellular proliferation, particularly in tumor cells. In this study, chromatin remodeling with effects extending to heterochromatic areas was investigated by image analysis in non-transformed NIH 3T3 cells treated for different periods with different doses of VPA and TSA under conditions that indicated no loss of cell viability. Image analysis revealed chromatin decondensation that affected not only euchromatin but also heterochromatin, concomitant with decreased histone deacetylase activity and a general increase in histone H3 acetylation. Heterochromatin protein 1-α (HP1-α), identified immunocytochemically, was depleted from the pericentromeric heterochromatin following exposure to both HDACIs. Drastic changes affecting cell proliferation and micronucleation, but no alterations in CCND2 expression, in Bcl-2/Bax expression ratios, or in cell death, occurred following a 48-h exposure of the NIH 3T3 cells, particularly in response to higher doses of VPA. Our results demonstrate that even low doses of VPA (0.05 mM) and TSA (10 ng/ml) applied for 1 h can affect chromatin structure, including that of heterochromatin areas, in non-transformed cells. HP1-α depletion, probably related to histone demethylation at H3K9me3, is induced in NIH 3T3 cells in addition to the effect of VPA and TSA on histone H3 acetylation. Nevertheless, alterations in cell proliferation and micronucleation, possibly depending on mitotic spindle defects, require longer exposure to higher doses of VPA and TSA.
Abstract:
Science education is under revision. Recent changes in society require changes in education to respond to new demands. Scientific literacy can be considered a new goal of science education, and the epistemological gap between natural sciences and literacy disciplines must be overcome. The history of science is a possible bridge to link these 'two cultures' and to foster an interdisciplinary approach in the classroom. This paper acknowledges Darwin's legacy and proposes the use of cartoons and narrative expositions to put this interesting chapter of science into its historical context. A five-lesson didactic sequence was developed to tell part of the story of Darwin's expedition through South America for students from 10 to 12 years of age. Beyond geological and biological perspectives, the inclusion of historical, social and geographical facts demonstrated the beauty and complexity of the findings that Darwin employed to propose the theory of evolution.
Abstract:
This work proposes the use of evolutionary computation to jointly solve the maximum-likelihood multiuser channel estimation (MuChE) and detection problems, both related to direct sequence code division multiple access (DS/CDMA). The effectiveness of the proposed heuristic approach is demonstrated by comparing performance and complexity figures of merit with those obtained by traditional methods found in the literature. Simulation results for a genetic algorithm (GA) applied to multipath DS/CDMA channel estimation and multi-user detection (MuD) show that the proposed genetic algorithm multi-user channel estimation (GAMuChE) yields a normalized mean square error (nMSE) below 11% under slowly varying multipath fading channels, over a large range of Doppler frequencies and at medium system load, while exhibiting lower complexity than both maximum likelihood multi-user channel estimation (MLMuChE) and the gradient descent method (GrdDsc). A near-optimum multi-user detector based on the genetic algorithm (GAMuD), also proposed in this work, provides a significant reduction in computational complexity when compared to the optimum multi-user detector (OMuD). In addition, the complexity of the GAMuChE and GAMuD algorithms was analyzed jointly in terms of the number of operations necessary to reach convergence, and compared to other joint MuChE and MuD strategies. The joint GAMuChE-GAMuD scheme can be regarded as a promising alternative for implementing third-generation (3G) and fourth-generation (4G) wireless systems in the near future. Copyright (C) 2010 John Wiley & Sons, Ltd.
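To make the heuristic concrete, the sketch below shows a minimal, generic genetic algorithm loop (tournament selection, uniform crossover, Gaussian mutation) minimising a toy least-squares objective; the individuals, objective, and parameters are invented for illustration and this is not the GAMuChE/GAMuD implementation evaluated in the paper.

```python
# Minimal generic genetic-algorithm sketch (not the paper's GAMuChE/GAMuD):
# real-coded individuals, tournament selection, uniform crossover, Gaussian mutation.
import numpy as np

rng = np.random.default_rng(0)
target = np.array([0.3, -0.7, 1.2, 0.5])        # toy "channel" vector to estimate

def fitness(population):
    # Negative squared error against the target: higher is better.
    return -np.sum((population - target) ** 2, axis=1)

pop_size, n_genes, generations = 40, target.size, 200
population = rng.uniform(-2, 2, size=(pop_size, n_genes))

for _ in range(generations):
    fit = fitness(population)
    # Tournament selection: the better of two random individuals becomes a parent.
    idx = rng.integers(pop_size, size=(2 * pop_size, 2))
    parents = population[np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # Uniform crossover between consecutive parent pairs.
    mask = rng.random((pop_size, n_genes)) < 0.5
    children = np.where(mask, parents[0::2], parents[1::2])
    # Gaussian mutation.
    children += rng.normal(0, 0.05, size=children.shape)
    # Elitism: keep the best individual from the previous generation.
    children[0] = population[np.argmax(fit)]
    population = children

best = population[np.argmax(fitness(population))]
print("estimated:", np.round(best, 3), "target:", target)
```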
Abstract:
The soil bacterium Pseudomonas fluorescens Pf-5 produces two siderophores, a pyoverdine and enantio-pyochelin, and its proteome includes 45 TonB-dependent outer-membrane proteins, which commonly function in uptake of siderophores and other substrates from the environment. The 45 proteins share the conserved beta-barrel and plug domains of TonB-dependent proteins but only 18 of them have an N-terminal signaling domain characteristic of TonB-dependent transducers (TBDTs), which participate in cell-surface signaling systems. Phylogenetic analyses of the 18 TBDTs and 27 TonB-dependent receptors (TBDRs), which lack the N-terminal signaling domain, suggest a complex evolutionary history including horizontal transfer among different microbial lineages. Putative functions were assigned to certain TBDRs and TBDTs in clades including well-characterized orthologs from other Pseudomonas spp. A mutant of Pf-5 with deletions in pyoverdine and enantio-pyochelin biosynthesis genes was constructed and characterized for iron-limited growth and utilization of a spectrum of siderophores. The mutant could utilize as iron sources a large number of pyoverdines with diverse structures as well as ferric citrate, heme, and the siderophores ferrichrome, ferrioxamine B, enterobactin, and aerobactin. The diversity and complexity of the TBDTs and TBDRs with roles in iron uptake clearly indicate the importance of iron in the fitness and survival of Pf-5 in the environment.
Abstract:
In this paper we follow the BOID (Belief, Obligation, Intention, Desire) architecture to describe agents and agent types in Defeasible Logic. We argue, in particular, that the introduction of obligations can provide a new reading of the concepts of intention and intentionality. Then we examine the notion of social agent (i.e., an agent where obligations prevail over intentions) and discuss some computational and philosophical issues related to it. We show that the notion of social agent either requires more complex computations or has some philosophical drawbacks.
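A highly simplified sketch of this idea, under the assumption that an agent type can be reduced to a priority order over the BOID components (the paper's Defeasible Logic formalisation is richer than this): conflicts between applicable rules are resolved in favour of the higher-ranked component, so for a social agent an obligation prevails over a conflicting intention.

```python
# Very simplified sketch: agent types as priority orders over BOID components.
# The paper itself works in Defeasible Logic; this toy code does not model that.

# Component codes: B = belief, O = obligation, I = intention, D = desire.
AGENT_TYPES = {
    "social":  ["B", "O", "I", "D"],   # obligations prevail over intentions (as in the paper)
    "selfish": ["B", "I", "D", "O"],   # illustrative ordering where intentions beat obligations
}

def resolve(conflicting_rules, agent_type):
    """Return the conclusion whose source component ranks highest for this agent type.

    conflicting_rules: list of (component, conclusion) pairs whose conclusions disagree.
    """
    order = AGENT_TYPES[agent_type]
    return min(conflicting_rules, key=lambda rule: order.index(rule[0]))[1]

# Toy conflict: an obligation to stay and help versus an intention to leave.
rules = [("O", "stay_and_help"), ("I", "leave_for_meeting")]
print(resolve(rules, "social"))    # -> stay_and_help (obligation prevails)
print(resolve(rules, "selfish"))   # -> leave_for_meeting (intention prevails)
```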
Abstract:
In 1984, George Orwell presented the future as a dystopian vision in which everyday existence was governed and redefined by an oppressive regime. Winston Smith's daily duties at the Ministry of Truth involved the invention, rewriting and erasing of fragments of history as a means of perpetuating contentment, uniformity and control. History, as Orwell described it in the novel, 'was a palimpsest, scraped clean and reinscribed exactly as often as was necessary.' More than a quarter of a century after the publication of 1984, Michel Foucault discussed the cinematic representation and misrepresentation of French history and identity in terms of what he called the manipulation of 'popular memory'. In what was tantamount to a diluted version of Orwell's palimpsestic histories, Foucault stated that 'people are not shown what they were, but what they must remember having been.' This paper will investigate notions of memory, identity and the everyday through a discussion of the community of Celebration in Florida. Conceived in the 1990s, Celebration was designed around a fictionalised representation of pre-1940s small-town America, using nostalgia for a mythologised past to create a sense of comfort, community and conformity among its residents. Adapting issues raised by Orwell, Foucault and Baudrillard, this paper will discuss the way in which architecture, like film and literature, can participate in what Foucault described as the manipulation of popular memory, inducing and exploiting a nostalgia for an everyday past that never really existed.
Abstract:
Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information from large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured, and can consist of text, categorical or numerical values. One of the important characteristics of data mining is its ability to deal with data that are large in volume, distributed, time-variant, noisy, and high-dimensional. A large number of data mining algorithms have been developed for different applications. For example, association rule mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied in decision-making problems, and sequential and time series mining algorithms can be used in predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for applications in engineering fields. Together with regression, classification is mainly used for predictive modelling. A number of classification algorithms are in practical use. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision tree and rule-based approaches such as C4.5 (Quinlan, 1996); probabilistic methods such as the Bayesian classifier (Lewis, 1998); online methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); example-based methods such as k-nearest neighbours (Duda & Hart, 1973); and SVM (Cortes & Vapnik, 1995). Other important techniques for classification tasks include Associative Classification (Liu et al., 1998) and Ensemble Classification (Tumer, 1996).
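As a small, generic illustration of the classification task described above, the sketch below trains two of the algorithm families mentioned (a decision tree and a k-nearest-neighbour classifier) on the standard Iris dataset using scikit-learn; the dataset and library are illustrative choices, not part of the original text.

```python
# Generic classification sketch with scikit-learn (not tied to this chapter's
# applications): train a decision tree and a k-NN classifier and compare accuracy.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

for name, clf in [("decision tree", DecisionTreeClassifier(random_state=0)),
                  ("k-NN (k=5)", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_train, y_train)                       # supervised learning step
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: test accuracy = {acc:.3f}")
```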