811 results for Architecture and Complexity
Abstract:
The International Long-Term Ecological Research (ILTER) network comprises more than 600 scientific groups conducting site-based research in 40 countries. Its mission includes improving the understanding of global ecosystems and informing solutions to current and future environmental problems at the global scale. The ILTER network covers a wide range of social-ecological conditions and is aligned with the goals and approach of the Programme on Ecosystem Change and Society (PECS). Our aim is to examine and develop the conceptual basis for the proposed collaboration between ILTER and PECS. We describe how a coordinated effort by several contrasting LTER site-based research groups contributes to understanding how policies and technologies drive ecosystems either toward or away from the sustainable delivery of ecosystem services. This effort rests on three tenets: transdisciplinary research; cross-scale interactions and their subsequent dynamics; and an ecological stewardship orientation. The overarching goal is to design management practices that take into account trade-offs between using and conserving ecosystems, moving toward more sustainable solutions. To that end, we propose a conceptual approach linking ecosystem integrity, ecosystem services, and stakeholder well-being as a way to analyze the trade-offs among ecosystem services inherent in diverse management options. We also outline a methodological approach that includes: (i) monitoring and synthesis activities that follow spatial and temporal trends and changes at each site and document cross-scale interactions; (ii) developing analytical tools for integration; (iii) promoting trans-site comparison; and (iv) developing conceptual tools to design adequate policies and management interventions to deal with trade-offs. Finally, we highlight the heterogeneity of the social-ecological settings encountered in a subset of 15 ILTER sites. These case studies are diverse enough to provide a broad cross-section of contrasting ecosystems with different policy and management drivers of ecosystem conversion; distinct trends of biodiversity change; different stakeholder preferences for ecosystem services; and diverse components of well-being issues.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Background and problem – As a result of financial crises and the realization that there is a broader stakeholder network, recent decades have seen an increase in stakeholder demand for non-financial information in corporate reporting. This has led to a situation of information overload, where separate financial and sustainability reports have grown in length and complexity independently of each other. Integrated reporting has been presented as a solution to this problematic situation. The question is whether the corporate world believes this to be the solution and whether the development of corporate reporting is heading in this direction. Purpose – This thesis aims to examine and assess to what extent companies listed on the OMX Stockholm 30 (OMXS30), as per 2016-02-28, comply with the Strategic content element of the <IR> Framework and how this disclosure has developed since the framework’s pilot project and official release, using a self-constructed disclosure index based on its specific items. Methodology – The purpose was fulfilled through an analysis of 104 annual reports covering 26 companies over the period 2011-2014. The annual reports were assessed using a self-constructed disclosure index based on the <IR> Framework content element Strategy and Resource Allocation, where one point was given for each disclosed item. Analysis and conclusions – The study found that the OMXS30-listed companies comply to a large extent with the strategic content element of the <IR> Framework and that this compliance has grown steadily throughout the researched time span. There is still room for improvement, however, with a total average framework compliance of 84% for 2014. Although many items are being reported on, there are indications that companies generally miss out on the core values of Integrated reporting.
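As a rough illustration of the scoring method described in the abstract (one binary point per disclosed item, with compliance expressed as a share of all items), the sketch below computes per-company and average compliance. The item names and company data are hypothetical, not taken from the <IR> Framework text or the studied reports.

```python
# Minimal sketch of a binary disclosure index: one point per disclosed item,
# compliance = points scored / total items. Item names and data are hypothetical.

ITEMS = [
    "strategic_objectives",
    "resource_allocation_plans",
    "link_to_business_model",
    "response_to_external_environment",
]

def compliance_score(disclosed_items):
    """Return the share of index items a report discloses (0.0-1.0)."""
    return sum(1 for item in ITEMS if item in disclosed_items) / len(ITEMS)

# Hypothetical assessments of three annual reports for one year.
reports = {
    "Company A": {"strategic_objectives", "resource_allocation_plans",
                  "link_to_business_model", "response_to_external_environment"},
    "Company B": {"strategic_objectives", "link_to_business_model"},
    "Company C": {"strategic_objectives", "resource_allocation_plans",
                  "response_to_external_environment"},
}

scores = {name: compliance_score(items) for name, items in reports.items()}
average = sum(scores.values()) / len(scores)
print(scores)                                 # per-company compliance
print(f"average compliance: {average:.0%}")   # sample-wide average for the year
```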
Abstract:
The evolution and maturation of Cloud Computing has created an opportunity for the emergence of new Cloud applications. High-Performance Computing (HPC), a class of complex problem solving, arises as a new business consumer, taking advantage of what the Cloud offers while leaving behind expensive datacenter management and difficult grid development. Standing at an advanced stage of maturity, today’s Cloud has discarded many of its drawbacks, becoming more and more efficient and widespread. Performance enhancements, price drops due to massification, and customizable on-demand services have attracted increased attention from other markets. HPC, although a very well established field, traditionally has a narrow deployment frontier and runs on dedicated datacenters or large computing grids. The problem with this common placement is mainly the initial cost and the inability to fully utilize resources, which not all research labs can afford. The main objective of this work was to investigate new technical solutions to allow the deployment of HPC applications on the Cloud, with particular emphasis on private on-premise resources – the lower end of the chain, which reduces costs. The work includes many experiments and analyses to identify obstacles and technology limitations. The feasibility of the objective was tested with new modeling, a new architecture and the migration of several applications. The final application integrates a simplified incorporation of both public and private Cloud resources, as well as HPC application scheduling, deployment and management. It uses a well-defined user-role strategy based on federated authentication, and a seamless procedure for daily usage that balances low cost and performance.
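The hybrid placement idea described above (prefer low-cost private on-premise resources, burst to the public Cloud when capacity runs out) can be sketched as a simple scheduling rule. This is an illustrative sketch only; the pool names, capacities and cost model are hypothetical and not the thesis's actual scheduler.

```python
# Hedged sketch of hybrid placement: jobs go to private, on-premise resources
# first (the low-cost end of the chain) and burst to a public Cloud only when
# private capacity is exhausted. Names, capacities and costs are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Pool:
    name: str
    free_cores: int
    cost_per_core_hour: float
    jobs: list = field(default_factory=list)

def place_job(job_name, cores, pools):
    """Place a job on the cheapest pool with enough free cores, or reject it."""
    for pool in sorted(pools, key=lambda p: p.cost_per_core_hour):
        if pool.free_cores >= cores:
            pool.free_cores -= cores
            pool.jobs.append(job_name)
            return pool.name
    return None  # no capacity anywhere: queue or reject

pools = [Pool("private-onpremise", free_cores=64, cost_per_core_hour=0.0),
         Pool("public-cloud", free_cores=1024, cost_per_core_hour=0.05)]

for name, cores in [("cfd-run", 48), ("md-run", 32), ("qc-run", 16)]:
    print(name, "->", place_job(name, cores, pools))
# cfd-run fits on-premise; md-run bursts to the public cloud; qc-run fits on-premise.
```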
Abstract:
This dissertation is the culmination of the Integrated Master's in Architecture and Urbanism at Universidade Fernando Pessoa – Porto. Its object of study is industry in territories of diffuse urbanization, the main objective being to understand it principally through the relationship between industrial dispersion and the occupation guidelines set by spatial planning instruments at the municipal level. To this end, four zones were analyzed, covering parts of the parishes of Gandra, Rebordosa, Lordelo, Vilela, Sobrosa and Duas Igrejas, located in the northwest of the municipality of Paredes, and observed over a period delimited between 1947 and 2011. This is a territory of great complexity, marked by a multifunctionality of spaces. It is characterized, fundamentally, by a diffuse/dispersed occupation supported by the network of roads and public paths, in which industrial activity predominates at the most diverse scales, from industrial zones to units embedded in family-based residential clusters. This dispersion intensified over the course of the 20th century, as industries settled in a scattered pattern and fostered the development of a family-economy model that is reflected in the territory. The implementation of formal planning at the end of that century established rules which nonetheless seem insufficient to control the most negative aspects of pre-plan individual choices. Indeed, the current model of industrial siting raises many concerns regarding the preservation of “green” structures, and may come to compromise, in the future, the environmental and ecological sustainability of this territory. We conclude with some brief reflections on this issue, after revisiting the main questions of the work.
Abstract:
This paper discusses the advantages of database-backed websites and describes the model for a library website implemented at the University of Nottingham using open source software, PHP and MySQL. As websites continue to grow in size and complexity, it becomes increasingly important to introduce automation to help manage them. It is suggested that a database-backed website offers many advantages over one built from static HTML pages. These include consistency of style and content, the ability to present different views of the same data, devolved editing and enhanced security. The University of Nottingham Library Services website is described, and issues surrounding its design, technological implementation and management are explored.
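A minimal sketch of the database-backed pattern the paper describes: page content lives in a database table and is wrapped in a shared template, so every page inherits a consistent style. Python's built-in sqlite3 stands in for the MySQL/PHP stack used at Nottingham, and the table and column names are hypothetical.

```python
# Minimal sketch of a database-backed page: content is stored in a table and
# rendered through one site-wide template, giving consistent style across pages.
# sqlite3 stands in for MySQL; table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (slug TEXT PRIMARY KEY, title TEXT, body TEXT)")
conn.execute("INSERT INTO pages VALUES (?, ?, ?)",
             ("opening-hours", "Opening Hours", "<p>The library is open 9-5.</p>"))

TEMPLATE = """<html><head><title>{title}</title></head>
<body><h1>{title}</h1>{body}</body></html>"""

def render_page(slug):
    """Fetch one page record and wrap it in the site-wide template."""
    row = conn.execute(
        "SELECT title, body FROM pages WHERE slug = ?", (slug,)).fetchone()
    if row is None:
        return "<html><body><h1>404 Not Found</h1></body></html>"
    title, body = row
    return TEMPLATE.format(title=title, body=body)

print(render_page("opening-hours"))
```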
Abstract:
What if the architectural process of making could incorporate time? All designers who impact the physical environment, consciously and unconsciously, are gatekeepers of the past, commentators on the present, and speculators about the future. This project proposes the creation of architecture and adaptive public space that looks to historical memories, fosters present-day cultural formation, and offers new alternative visions for the city of the future. The thesis asks what it means to design for stasis and change at a variety of scales (urban, architectural, and detail) and arrives at a speculative new neighborhood, institutional buildings, and landscape. Central to this project is the idea of the architect as archeologist, anthropologist, and artist. The project focuses on a rapidly changing part of the city of Fort Worth, Texas and assigns a multipurpose institutional building and public space as a method of investigation. The thesis hopes to further architectural discourse about the role of architecture in the preservation of memory, the adaptive potential of public spaces, and the role of time in architecture.
Abstract:
Many applications, including communications, test and measurement, and radar, require the generation of signals with a high degree of spectral purity. One method for producing tunable, low-noise source signals is to combine the outputs of multiple direct digital synthesizers (DDSs) arranged in a parallel configuration. In such an approach, if all noise is uncorrelated across channels, the noise will decrease relative to the combined signal power, resulting in a reduction of sideband noise and an increase in SNR. However, in any real array, the broadband noise and spurious components will be correlated to some degree, limiting the gains achieved by parallelization. This thesis examines the potential performance benefits that may arise from using an array of DDSs, with a focus on several types of common DDS errors, including phase noise, phase truncation spurs, quantization noise spurs, and quantizer nonlinearity spurs. Measurements to determine the level of correlation among DDS channels were made on a custom 14-channel DDS testbed. The investigation of the phase noise of a DDS array indicates that the contribution to the phase noise from the DACs can be decreased to a desired level by using a large enough number of channels. In such a system, the phase noise quality of the source clock and the system cost and complexity will be the main limitations on the phase noise of the DDS array. The study of phase truncation spurs suggests that, at least in our system, the phase truncation spurs are uncorrelated, contrary to the theoretical prediction. We believe this decorrelation is due to an unidentified mechanism in our DDS array that is unaccounted for in our current operational DDS model. This mechanism, likely due to some timing element in the FPGA, causes some randomness in the relative phases of the truncation spurs from channel to channel each time the DDS array is powered up. This randomness decorrelates the phase truncation spurs, opening the potential for SFDR gain from using a DDS array. The analysis of the correlation of quantization noise spurs in an array of DDSs shows that the total quantization noise power of each DDS channel is uncorrelated for nearly all DAC output bit widths. This suggests that a gain in SQNR approaching a factor of N is possible for an N-channel array of DDSs. This gain will be most apparent for low-bit DACs, in which quantization noise is notably higher than the thermal noise contribution. Lastly, the measurements of the correlation of quantizer nonlinearity spurs demonstrate that the second and third harmonics are highly correlated across channels for all frequencies tested. This means that there is no benefit to using an array of DDSs for mitigating in-band quantizer nonlinearities. As a result, alternate methods of harmonic spur management must be employed.
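The uncorrelated-noise argument at the start of the abstract can be demonstrated numerically: combining N channels that share the same sinusoid but carry independent noise improves SNR by roughly 10*log10(N) dB. The sketch below is illustrative only; the signal and noise parameters are arbitrary and unrelated to the 14-channel testbed.

```python
# Illustrative sketch of the uncorrelated-noise argument: averaging N channels
# with identical signal but independent noise raises SNR by ~10*log10(N) dB.
# Signal and noise parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
fs, f0, n_samples = 1.0e6, 37.0e3, 4096
t = np.arange(n_samples) / fs
clean = np.sin(2 * np.pi * f0 * t)

def snr_db(signal, noisy):
    noise = noisy - signal
    return 10 * np.log10(np.mean(signal**2) / np.mean(noise**2))

for n_channels in (1, 4, 14):
    # Each channel: identical signal plus its own independent white noise.
    channels = clean + 0.1 * rng.standard_normal((n_channels, n_samples))
    combined = channels.mean(axis=0)   # coherent combination of the channels
    print(n_channels, "channels:", round(snr_db(clean, combined), 1), "dB SNR")
```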
Abstract:
This research examines the process of placemaking in LeDroit Park, a residential Washington, DC, neighborhood with a historic district at its core. Unpacking the entwined physical and social evolution of this small community within the context of the Nation’s Capital, the analysis provides insight into the role of urban design and development, as well as historic designation, in shaping collective identity. Initially planned and designed in 1873 as a gated suburb just beyond the formal L’Enfant-designed city boundary, LeDroit Park was intended as a retreat for middle- and upper-class European Americans from the growing density and social diversity of the city. With a mixture of large romantic revival mansions and smaller frame cottages set on grassy plots evocative of an idealized rural village, the physical design was intentionally inwardly focused. This feeling of refuge was underscored by a physical fence that surrounded the development, intended to prevent African Americans from nearby Howard University and the surrounding neighborhood from using the community’s private streets to access the City of Washington. Within two decades of its founding, LeDroit Park was incorporated into the District of Columbia, the surrounding fence was demolished, and the neighborhood was racially integrated. Due to increasingly stringent segregation laws and customs in the city, this period of integration lasted less than twenty years, and LeDroit Park developed into an elite African American enclave, using the urban design as a bulwark against the indignities of a segregated city. Throughout the 20th century, housing infill and construction increased density, yet the neighborhood never lost the feeling of security derived from the neighborhood plan. Highlighting the architecture and street design, neighbors successfully secured historic district designation in 1974 in order to halt campus expansion. After a stalemate that lasted two decades, the neighborhood began another period of transformation, both racial and socio-economic, catalyzed by a multi-pronged investment program led by Howard University. Through interviews with long-term and new community members, this investigation asserts that the 140-year development history, including recent physical interventions, is integral to placemaking, shaping the material character as well as the social identity of residents.
Abstract:
This portfolio thesis describes work undertaken by the author under the Engineering Doctorate program of the Institute for System Level Integration. It was carried out in conjunction with the sponsor company, Teledyne Defence Limited. A radar warning receiver is a device used to detect and identify the emissions of radars. Radar warning receivers were originally developed during the Second World War and are found today on a variety of military platforms as part of the platform’s defensive systems. Teledyne Defence has designed and built components and electronic subsystems for the defence industry since the 1970s. This thesis documents part of the work carried out to create Phobos, Teledyne Defence’s first complete radar warning receiver. Phobos was designed to be the first low-cost radar warning receiver. This was made possible by the reuse of existing Teledyne Defence products, commercial off-the-shelf hardware and advanced UK government algorithms. The challenges of this integration are described and discussed, with details given of the software architecture and the development of the embedded application. The performance of the embedded system as a whole is described and qualified within the context of a low-cost system.
Abstract:
Apparitions of empire and imperial ideologies were deeply embedded in the International Exhibition, a distinct exhibitionary paradigm that came to prominence in the mid-nineteenth century. Exhibitions were platforms for the display of objects, the movement of people, and the dissemination of ideas across and between regions of the British Empire, thereby facilitating contact between its different cultures and societies. This thesis aims to disrupt a dominant understanding of International Exhibitions, which advances the notion that all exhibitions, irrespective of when or where they were staged, upheld a singular imperial discourse (e.g. Greenhalgh 1988, Rydell 1984). Rather, this thesis suggests International Exhibitions responded to and reflected the unique social, political and economic circumstances in which they took place, functioning as cultural environments in which pressing concerns of the day were worked through. Understood thus, the International Exhibition becomes a space for self-presentation, serving as a stage from which a multitude of interests and identities were constructed, performed and projected. This thesis looks to the visual and material culture of the International Exhibition in order to uncover this more nuanced history, and foregrounds an analysis of the intersections between practices of exhibition-making and identity-making. The primary focus is a set of exhibitions held in Glasgow in the late 1880s and early 1900s, which extends the geographic and temporal boundaries of the existing scholarship. What is more, it looks at representations of Canada at these events, another party whose involvement in the International Exhibition tradition has gone largely unnoticed. Consequently, this thesis is a thematic investigation of the links between a municipality routinely deemed the ‘Second City of the Empire’ and a Dominion settler colony, two types of geographic setting rarely brought into dialogue. It analyses three key elements of the exhibition-making process, exploring how iconographies of ‘quasi-nationhood’ were expressed through an exhibition’s planning and negotiation, its architecture and its displays. This original research framework deliberately cuts across strata that continue to define conceptions of the British Empire, and pushes beyond a conceptual model defined by metropole and colony. Through examining International Exhibitions held in Glasgow in the late-Victorian and Edwardian periods, and visions of Canada in evidence at these events, the goal is to offer a novel intervention into the existing literature concerning the cultural history of empire, one that emphasises fluidity rather than fixity and which muddles the boundaries between centre and periphery.
Abstract:
This work is aimed at understanding and unifying information on epidemiological modelling methods and how those methods relate to public policy addressing human health, specifically in the context of infectious disease prevention, pandemic planning, and health behaviour change. The thesis employs multiple qualitative and quantitative methods, and is presented as a manuscript of several individual, data-driven projects combined in a narrative arc. The first chapter introduces the scope and complexity of this interdisciplinary undertaking, describing several topical intersections of importance. The second chapter begins the presentation of original data and describes in detail two exercises in computational epidemiological modelling pertinent to pandemic influenza planning and policy; the next chapter presents additional original data on how public confidence in modelling methodology may affect planned health behaviour change as recommended in public health policy. The thesis narrative continues in the final data-driven chapter, describing how health policymakers use modelling methods and scientific evidence to inform and construct health policies for the prevention of infectious diseases, and concludes with a narrative chapter that evaluates the breadth of these data and recommends strategies for the optimal use of modelling methodologies when informing public health policy in applied public health scenarios.
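For readers unfamiliar with the modelling referred to above, the sketch below shows one common family of such models, a compartmental SIR model integrated with a simple Euler step. It is an illustrative example with arbitrary, hypothetical parameters, not one of the models developed in the thesis.

```python
# A minimal compartmental SIR model, one common family of epidemiological
# models; this is an illustrative sketch, not the thesis's own models.
def simulate_sir(beta, gamma, s0, i0, r0, days, dt=0.1):
    """Integrate dS/dt=-beta*S*I, dI/dt=beta*S*I-gamma*I, dR/dt=gamma*I (Euler)."""
    s, i, r = s0, i0, r0
    trajectory = [(0.0, s, i, r)]
    steps = int(days / dt)
    for step in range(1, steps + 1):
        new_infections = beta * s * i * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        trajectory.append((step * dt, s, i, r))
    return trajectory

# Hypothetical influenza-like parameters: R0 = beta/gamma = 1.5, one-week infectious period.
history = simulate_sir(beta=1.5 / 7, gamma=1 / 7, s0=0.999, i0=0.001, r0=0.0, days=180)
peak_time, _, peak_i, _ = max(history, key=lambda row: row[2])
print(f"epidemic peak: {peak_i:.1%} of the population infectious at day {peak_time:.0f}")
```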
Abstract:
Due to the growth of design size and complexity, design verification is an important aspect of the logic circuit development process. The purpose of verification is to validate that the design meets the system requirements and specification. This is done by either functional or formal verification. The most popular approach to functional verification is the use of simulation-based techniques. Using models to replicate the behaviour of an actual system is called simulation. In this thesis, a software/data-structure architecture without explicit locks is proposed to accelerate logic gate circuit simulation. We call this system ZSIM. The ZSIM software architecture simulator targets low-cost SIMD multi-core machines. Its performance is evaluated on the Intel Xeon Phi and two other machines (Intel Xeon and AMD Opteron). The aims of these experiments are to: • verify that the data structure used allows SIMD acceleration, particularly on machines with gather instructions (section 5.3.1); • verify that, on sufficiently large circuits, substantial gains can be made from multicore parallelism (section 5.3.2); • show that a simulator using this approach out-performs an existing commercial simulator on a standard workstation (section 5.3.3); • show that the performance on a cheap Xeon Phi card is competitive with results reported elsewhere on much more expensive supercomputers (section 5.3.5). To evaluate ZSIM, two types of test circuits were used: 1. circuits from the IWLS benchmark suite [1], which allow direct comparison with other published studies of parallel simulators; 2. circuits generated by a parametrised circuit synthesizer. The synthesizer used an algorithm that has been shown to generate circuits that are statistically representative of real logic circuits, and it allowed testing of a range of very large circuits, larger than those for which it was possible to obtain open source files. The experimental results show that with SIMD acceleration and multicore parallelism, ZSIM achieved a peak parallelisation factor of 300 on the Intel Xeon Phi and 11 on the Intel Xeon. With only SIMD enabled, ZSIM achieved a maximum parallelisation gain of 10 on the Intel Xeon Phi and 4 on the Intel Xeon. Furthermore, it was shown that this software architecture simulator running on a SIMD machine is much faster than, and can handle much bigger circuits than, a widely used commercial simulator (Xilinx) running on a workstation. The performance achieved by ZSIM was also compared with similar pre-existing work on logic simulation targeting GPUs and supercomputers. It was shown that the ZSIM simulator running on a Xeon Phi machine gives comparable simulation performance to the IBM Blue Gene supercomputer at very much lower cost. The experimental results have shown that the Xeon Phi is competitive with simulation on GPUs and allows the handling of much larger circuits than have been reported for GPU simulation. When targeting the Xeon Phi architecture, the automatic cache management of the Xeon Phi handles and manages the on-chip local store without any explicit mention of the local store in the architecture of the simulator itself; when targeting GPUs, by contrast, explicit cache management in the program increases the complexity of the software architecture. Furthermore, one of the strongest points of the ZSIM simulator is its portability: the same code was tested on both AMD and Xeon Phi machines, and the same architecture that performs efficiently on the Xeon Phi was ported to a 64-core NUMA AMD Opteron.
To conclude, the two main achievements are restated as follows: the primary achievement of this work was proving that the ZSIM architecture is faster than previously published logic simulators on low-cost platforms; the secondary achievement was the development of a synthetic testing suite that went beyond the scale range previously publicly available, based on prior work showing that the synthesis technique is valid.
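The structure-of-arrays idea behind SIMD-friendly gate simulation can be sketched generically: gates of one topological level are stored in flat arrays and evaluated with a single vectorized gather/operate/scatter pass per level, with no per-gate loop and no locks. NumPy vectorization stands in for SIMD here; the layout and gate encoding are hypothetical and are not ZSIM's actual data structure.

```python
# Generic sketch of a structure-of-arrays gate level: per-gate fields live in
# flat arrays, and one vectorized gather/operate/scatter evaluates the level.
# NumPy stands in for SIMD; this is not ZSIM's actual data structure.
import numpy as np

# Net values for a tiny circuit: nets 0-3 are primary inputs, 4-6 gate outputs.
values = np.array([1, 0, 1, 1, 0, 0, 0], dtype=np.uint8)

# One topological level of two-input gates, in structure-of-arrays form.
op   = np.array([0, 0, 1])   # 0 = AND, 1 = OR
in_a = np.array([0, 2, 0])   # index of first input net
in_b = np.array([1, 3, 3])   # index of second input net
out  = np.array([4, 5, 6])   # index of output net

def evaluate_level(values, op, in_a, in_b, out):
    """Evaluate every gate of one level with vectorized gathers, no per-gate loop."""
    a = values[in_a]                          # gather first inputs
    b = values[in_b]                          # gather second inputs
    result = np.where(op == 0, a & b, a | b)  # apply each gate's operation
    values[out] = result                      # scatter outputs to their nets

evaluate_level(values, op, in_a, in_b, out)
print(values)   # net 4 = 1&0 = 0, net 5 = 1&1 = 1, net 6 = 1|1 = 1
```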
Abstract:
This thesis addresses cultural and physical place reclamation at the ambiguous intersection of ‘city’ and ‘nature.’ It does so by creating a juxtaposed sequence of multi-scalar interventions that challenge the conventional boundaries of architecture and landscape architecture, in order to make commonplace a new dynamic threshold condition in Richmond, Virginia. At its core, this thesis is an attempt at place-making on a site which has become ‘no place.’ The concept is manifested in a landscape park on Mayo Island in Richmond, anchored by a community retreat center and architectural follies along a constructed path. The interventions coincide with the value of place in historical Richmond: an integrated, socially desegregated waterfront hinge; a social nexus of inherent change at the point where the river itself changes at the fall line.
Abstract:
Dissertation (Master's)--Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2015.