21 results for Modern and contemporary architecture
Abstract:
This thesis explores how architecture can adapt local vernacular design principles to contemporary building design in a rural setting. Vernacular buildings in Guyana present a unique and coherent set of design principles developed in response to climatic and cultural conditions. The concept of “habitus,” proposed by the philosopher Pierre Bourdieu to describe the evolving nature of social culture, was used to interpret local Guyanese buildings. These principles were then applied to the design of a Women’s Center in the village of Port Mourant on the east coast of Guyana. The design specifically interpreted the “bottom-house” of local Guyanese architecture, an inherently flexible transitional outdoor space beneath raised buildings. The design of the Women’s Center demonstrates how contemporary architectural design can respond to climatic requirements, local preferences, and societal needs to support the local culture.
Abstract:
The purpose of this thesis is to explore the design of mobile architecture that challenges traditional ideas of site through the design of a museum to commemorate immigration to the United States. The thesis develops a floating, movable, inhabitable structure that travels the intracoastal waterways of South Florida, within the public areas of Miami. The floating museum offers new perceptions of the city and new means of occupying its various settings. Its architectural elements do not change but are read differently in each location. The museum brings its exhibitions to the city as an event: one moment it is there, and the next it is gone. In its design, the Museum of Immigration explores the experience of leaving one place to settle in another. As a prototype, it might be the first in a series of such buildings around the country that offer a new relationship between building and site.
Abstract:
A nuclear waste stream is the complete flow of waste material from origin to treatment facility to final disposal. The objective of this study was to design and develop a Geographic Information Systems (GIS) module using the Google Maps Application Programming Interface (API) for better visualization of nuclear waste streams, identifying and displaying various waste stream parameters. A proper display of these parameters would enable managers at Department of Energy waste sites to visualize the information needed to plan waste transport. The study also developed an algorithm based on quadratic Bézier curves to make the map more understandable and usable. Microsoft Visual Studio 2012 and Microsoft SQL Server 2012 were used to implement the project. The study has shown that the combination of several technologies can successfully provide dynamic mapping functionality. Future work should explore additional Google Maps API functionality to further enhance the visualization of nuclear waste streams.
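As a rough illustration of the curve construction mentioned in this abstract, the sketch below shows how a quadratic Bézier arc can replace a straight origin-to-destination line so that overlapping transport routes stay readable on a map. It is a minimal sketch, not the study's actual module (which used the Google Maps API and SQL Server); the coordinates, function names, and curvature factor are illustrative assumptions.

```python
# Minimal sketch: bend an origin-to-destination route into a quadratic Bezier arc.
# Coordinates and the curvature factor are illustrative assumptions, not real sites.

def quadratic_bezier(p0, p1, p2, steps=32):
    """Sample points along B(t) = (1-t)^2*P0 + 2(1-t)*t*P1 + t^2*P2 for t in [0, 1]."""
    points = []
    for i in range(steps + 1):
        t = i / steps
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        points.append((x, y))
    return points

def arc_between(origin, destination, curvature=0.2):
    """Place the control point perpendicular to the route's midpoint so that
    parallel waste-stream routes do not overlap as straight lines."""
    mx, my = (origin[0] + destination[0]) / 2, (origin[1] + destination[1]) / 2
    dx, dy = destination[0] - origin[0], destination[1] - origin[1]
    control = (mx - dy * curvature, my + dx * curvature)
    return quadratic_bezier(origin, control, destination)

# Example with two illustrative (lat, lng) pairs for a generation site and a
# hypothetical disposal facility; each sampled point could feed a map polyline.
path = arc_between((35.93, -84.31), (33.30, -104.53))
print(path[:3], "...")
```

In a mapping front end, the sampled points would simply be passed to a polyline overlay; increasing the curvature factor separates routes that share the same endpoints.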
Abstract:
The purpose of this thesis was to redesign a commercial center in Miami, Florida in a manner that incorporates the needs of pedestrians as well as those of the automobile. In my research, I studied projects that had been successful at integrating cars into retail design. I applied the strategies learned from this research to the design of a center that creates a positive interaction between pedestrian and car traffic while addressing the needs of the surrounding community. I designed a master plan that includes a mix of residential, retail, commercial, and parking space. The parking is designed so that the retail center is not dominated by surface parking; rather, the automobile is introduced into the different layers of the proposed buildings. The design focused on connecting pedestrian plazas and the parking areas beneath them through the introduction of light and greenery. The findings show how a shopping center might transform the area around it by including spaces for residential, civic, cultural, and social functions, as well as for the automotive infrastructure that makes those functions possible.
Abstract:
Modern System-on-a-Chip (SoC) systems have grown rapidly, delivering increased processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially as energy consumption and chip area become two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified by using profiling tools. Hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeated functions. The performance of SoC systems can then be improved if hardware acceleration is used to accelerate the elements that incur performance overheads. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core, and the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, a central bus design and a co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed; the trade-offs among these three factors are compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) A system verification platform is designed based on an Integrated Circuit (IC) workflow, and hardware optimization techniques are used to achieve higher performance with lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system achieves a 2.8X performance improvement and a 31.84% reduction in energy consumption with the Bus-IP design, while the co-processor design achieves a 7.9X performance improvement and a 75.85% energy reduction.
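As a rough illustration of the profile-then-accelerate workflow described in this abstract, the sketch below profiles a software-only run to rank hotspot functions and then estimates the system-level gain of offloading the hotspot using Amdahl's law. It is a minimal sketch under stated assumptions: the thesis itself targets an H.264 CODEC on an FPGA platform, whereas the Python tooling, the stand-in kernel, and the numbers here are illustrative only.

```python
# Minimal sketch of the profile-then-accelerate workflow; the kernel, the
# hotspot fraction, and the accelerator speedup are illustrative assumptions.
import cProfile
import io
import pstats

def hotspot_kernel(data):
    # Stand-in for a highly mathematical, repeated function (e.g. a transform
    # loop inside a CODEC) that profiling would flag as a hotspot.
    return [((x * 1103515245 + 12345) % 2**31) / 2**31 for x in data]

def application():
    data = list(range(20000))
    for _ in range(50):  # repeated calls inflate "loop rounds" in the profile
        data = [int(v * 1000) for v in hotspot_kernel(data)]
    return data

# 1) Profile the software-only application and rank functions by time spent.
profiler = cProfile.Profile()
profiler.runcall(application)
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())

# 2) Estimate the system-level gain of offloading the hotspot (Amdahl's law).
def system_speedup(hotspot_fraction, accelerator_speedup):
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accelerator_speedup)

# e.g. a hotspot consuming 80% of runtime, accelerated 20x in FPGA fabric:
print(f"overall speedup ~ {system_speedup(0.8, 20.0):.2f}x")
```

The same reasoning applies whichever accelerator is chosen: the achievable system-wide speedup is bounded by how much of the runtime the identified hotspot actually covers, which is why the profiling step precedes the hardware design.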
Abstract:
Advanced technologies and social networks that allow data to be shared widely across the Internet have led to an explosion of pervasive multimedia data, generating high demand for multimedia services and applications that let people easily access and manage that data. In response, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data in multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) used to produce the initial detection results, and the TMCA algorithm is then proposed to efficiently incorporate temporal semantics, re-ranking the results and improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis; in this framework, an affinity propagation-based summarization method is therefore also proposed to transform unorganized data into a better structure with clean and well-organized information. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
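As a rough sketch of the affinity propagation-based summarization idea mentioned above (not the dissertation's implementation), the example below clusters redundant short reports and keeps each cluster's exemplar as the summary entry. scikit-learn, the TF-IDF representation, and the sample messages are assumptions for illustration.

```python
# Minimal sketch: affinity propagation picks exemplars from unorganized reports,
# turning redundant messages into a clean, well-organized summary.
# The sample messages and the TF-IDF features are illustrative assumptions.
from sklearn.cluster import AffinityPropagation
from sklearn.feature_extraction.text import TfidfVectorizer

reports = [
    "Flooding reported near the river, roads closed",
    "River flooding has closed several roads downtown",
    "Shelter open at the high school gymnasium",
    "High school gym opened as an emergency shelter",
    "Power outage affecting the east side of the city",
]

# Represent each unorganized report as a TF-IDF vector.
features = TfidfVectorizer().fit_transform(reports).toarray()

# Affinity propagation chooses the exemplars itself; no preset cluster count.
model = AffinityPropagation(random_state=0).fit(features)

# The exemplar of each cluster serves as one line of the summary.
for index in model.cluster_centers_indices_:
    print(reports[index])
```

The appeal of affinity propagation for this kind of summarization is that the number of clusters does not need to be fixed in advance, which suits disaster-information streams whose topic count is unknown.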