983 results for Multimodal text construction
Abstract:
The challenges posed by global climate change are motivating the investigation of strategies that can reduce the life cycle greenhouse gas (GHG) emissions of products and processes. While new construction materials and technologies have received significant attention, there has been limited emphasis on understanding how construction processes can be best managed to reduce GHG emissions. Unexpected disruptive events tend to adversely impact construction costs and delay project completion. They also tend to increase project GHG emissions. The objective of this paper is to investigate ways in which project GHG emissions can be reduced by appropriate management of disruptive events. First, an empirical analysis of construction data from a specific highway construction project is used to illustrate the impact of unexpected schedule delays in increasing project GHG emissions. Next, a simulation-based methodology is described to assess the effectiveness of alternative project management strategies in reducing GHG emissions. The contribution of this paper is that it explicitly considers project emissions, in addition to cost and project duration, in developing project management strategies. Practical application of the method discussed in this paper will help construction firms reduce their project emissions through strategic project management, and without significant investment in new technology. In effect, this paper lays the foundation for best practices in construction management that will optimize project cost and duration while minimizing GHG emissions.
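The simulation methodology itself is only summarized in this abstract; as a rough, hypothetical illustration of the underlying idea (the emission rates, disruption probabilities, and durations below are invented placeholders, not values from the paper), the following Python sketch estimates how random disruptions inflate both project duration and GHG emissions, and how a management strategy that lowers the disruption rate changes both.

```python
import random

# Hypothetical per-day emission rates (t CO2e/day): productive work vs. idled equipment.
ACTIVE_RATE = 4.0
IDLE_RATE = 1.5

def simulate_project(planned_days=120, disruption_prob=0.05,
                     max_delay=5, trials=2_000):
    """Monte Carlo estimate of average duration and GHG emissions under random disruptions."""
    results = []
    for _ in range(trials):
        duration, emissions = 0, 0.0
        remaining_work = planned_days
        while remaining_work > 0:
            if random.random() < disruption_prob:
                # Disruption: crews and equipment sit idle for a random number of days.
                delay = random.randint(1, max_delay)
                duration += delay
                emissions += delay * IDLE_RATE
            else:
                duration += 1
                remaining_work -= 1
                emissions += ACTIVE_RATE
        results.append((duration, emissions))
    avg_duration = sum(d for d, _ in results) / trials
    avg_emissions = sum(e for _, e in results) / trials
    return avg_duration, avg_emissions

if __name__ == "__main__":
    print(simulate_project())                      # baseline management strategy
    print(simulate_project(disruption_prob=0.02))  # strategy that mitigates disruptions
```

Comparing the two calls stands in for comparing alternative project management strategies on duration and emissions simultaneously.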
Abstract:
Information management is a key aspect of successful construction projects. Having inaccurate measurements and conflicting data can lead to costly mistakes, and vague quantities can ruin estimates and schedules. Building information modeling (BIM) augments a 3D model with a wide variety of information, which reduces many sources of error and can detect conflicts before they occur. Because new technology is often more complex, it can be difficult to effectively integrate it with existing business practices. In this paper, we will answer two questions: How can BIM add value to construction projects? and What lessons can be learned from other companies that use BIM or other similar technology? Previous research focused on the technology as if it were simply a tool, observing problems that occurred while integrating new technology into existing practices. Our research instead looks at the flow of information through a company and its network, seeing all the actors as part of an ecosystem. Building upon this idea, we proposed the metaphor of an information supply chain to illustrate how BIM can add value to a construction project. This paper then concludes with two case studies. The first case study illustrates a failure in the flow of information that could have been prevented by using BIM. The second case study profiles a leading design firm that has used BIM products for many years and shows the real benefits of using this technology.
Abstract:
Students are now involved in a vastly different textual landscape than many English scholars, one that relies on the “reading” and interpretation of multiple channels of simultaneous information. As a response to these new kinds of literate practices, my dissertation adds to the growing body of research on multimodal literacies, narratology in new media, and rhetoric through an examination of the place of video games in English teaching and research. I describe in this dissertation a hybridized theoretical basis for incorporating video games in English classrooms. This framework for textual analysis includes elements from narrative theory in literary study, rhetorical theory, and literacy theory, which, when combined to account for the multiple modalities and complexities of gaming, can provide new insights about those theories and practices across all kinds of media, whether in written texts, films, or video games. In creating this framework, I hope to encourage students to view texts from a meta-level perspective, encompassing textual construction, use, and interpretation. In order to foster meta-level learning in an English course, I use specific theoretical frameworks from the fields of literary studies, narratology, film theory, aural theory, reader-response criticism, game studies, and multiliteracies theory to analyze a particular video game: World of Goo. These theoretical frameworks inform pedagogical practices used in the classroom for textual analysis of multiple media. Examining a video game from these perspectives, I use analytical methods from each, including close reading, explication, textual analysis, and individual elements of multiliteracies theory and pedagogy. In undertaking an in-depth analysis of World of Goo, I demonstrate the possibilities for classroom instruction with a complex blend of theories and pedagogies in English courses. This blend of theories and practices is meant to foster literacy learning across media, helping students develop metaknowledge of their own literate practices in multiple modes. Finally, I outline a design for a multiliteracies course that would allow English scholars to use video games along with other texts to interrogate texts as systems of information. In doing so, students can hopefully view and transform systems in their own lives as audiences, citizens, and workers.
Abstract:
With energy demands and costs growing every day, the need for improving energy efficiency in electrical devices has become very important. Research into various methods of improving efficiency for all electrical components will be key to meeting future energy needs. This report documents the design, construction, and testing of a research-quality electric machine dynamometer and test bed. This test cell system can be used for research in several areas, including electric drive systems, electric vehicle propulsion systems, power electronic converters, and load/source elements in an AC microgrid, among many others. The test cell design criteria and decisions will be discussed in reference to user functionality and flexibility. The individual power components will be discussed in detail as to how they relate to the project, highlighting any features used in the operation of the test cell. A project timeline will be discussed, clearly stating the work done by the different individuals involved in the project. In addition, the system will be parameterized, and benchmark data will be used to demonstrate the functional operation of the system.
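The report itself covers hardware design rather than algorithms; purely as an illustration of the kind of benchmark calculation such a test cell supports (the torque, speed, and power readings below are hypothetical), mechanical power and efficiency can be computed from dynamometer measurements as follows.

```python
import math

def mechanical_power(torque_nm: float, speed_rpm: float) -> float:
    """Shaft power in watts: P = torque * angular velocity (rpm converted to rad/s)."""
    return torque_nm * speed_rpm * 2 * math.pi / 60.0

def efficiency(torque_nm: float, speed_rpm: float, electrical_power_w: float) -> float:
    """Machine efficiency = mechanical output power / electrical input power."""
    return mechanical_power(torque_nm, speed_rpm) / electrical_power_w

if __name__ == "__main__":
    # Hypothetical readings from the test cell instrumentation.
    p_mech = mechanical_power(torque_nm=12.5, speed_rpm=1800)    # about 2356 W
    print(f"mechanical power: {p_mech:.0f} W")
    print(f"efficiency: {efficiency(12.5, 1800, 2600):.1%}")     # about 90.6%
```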
Abstract:
Light-frame wood buildings are widely built in the United States (U.S.). Natural hazards cause huge losses to light-frame wood construction. This study proposes methodologies and a framework to evaluate the performance and risk of light-frame wood construction. Performance-based engineering (PBE) aims to ensure that a building achieves the desired performance objectives when subjected to hazard loads. In this study, the collapse risk of a typical one-story light-frame wood building is determined using the Incremental Dynamic Analysis method. The collapse risks of buildings at four sites in the Eastern, Western, and Central regions of the U.S. are evaluated. Various sources of uncertainty are considered in the collapse risk assessment so that the influence of uncertainty on the collapse risk of light-frame wood construction is evaluated. The collapse risks of the same building subjected to maximum considered earthquakes in different seismic zones are found to be non-uniform. In certain areas of the U.S., snow accumulation is significant; it causes huge economic losses and threatens life safety. Limited study has been performed to investigate the snow hazard when combined with a seismic hazard. A Filtered Poisson Process (FPP) model is developed in this study, overcoming the shortcomings of the typically used Bernoulli model. The FPP model is validated by comparing the simulation results to weather records obtained from the National Climatic Data Center. The FPP model is applied in the proposed framework to assess the risk of a light-frame wood building subjected to combined snow and earthquake loads. The snow accumulation has a significant influence on the seismic losses of the building. The Bernoulli snow model underestimates the seismic loss of buildings in areas with snow accumulation. An object-oriented framework is proposed in this study to perform risk assessment for light-frame wood construction. For homeowners and stakeholders, risk expressed in terms of economic losses is much easier to understand than engineering parameters (e.g., inter-story drift). The proposed framework is used in two applications. One is to assess the loss of the building subjected to mainshock-aftershock sequences. Aftershock and downtime costs are found to be important factors in the assessment of seismic losses. The framework is also applied to a wood building in the state of Washington to assess the loss of the building subjected to combined earthquake and snow loads. The proposed framework is shown to be an appropriate tool for risk assessment of buildings subjected to multiple hazards. Limitations and future work are also identified.
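The thesis's FPP model is not reproduced in the abstract; the sketch below is only a generic shot-noise (filtered Poisson) simulation under assumed parameters: snowfall events arrive as a Poisson process, each deposits a random load, and deposits decay exponentially to mimic melting. All rates and magnitudes are hypothetical.

```python
import math
import random

def simulate_snow_load(rate_per_day=0.2, mean_load_kpa=0.3,
                       melt_tau_days=10.0, horizon_days=180, dt=1.0):
    """Filtered Poisson (shot-noise) process: ground snow load over one winter.

    S(t) = sum_i Y_i * exp(-(t - t_i) / tau) over Poisson event times t_i <= t.
    """
    events = []  # (arrival time in days, deposited load in kPa)
    t = 0.0
    while True:
        # Exponential inter-arrival times give Poisson arrivals of snowfall events.
        t += random.expovariate(rate_per_day)
        if t >= horizon_days:
            break
        events.append((t, random.expovariate(1.0 / mean_load_kpa)))

    loads = []
    for k in range(int(horizon_days / dt)):
        now = k * dt
        s = sum(y * math.exp(-(now - ti) / melt_tau_days)
                for ti, y in events if ti <= now)
        loads.append(s)
    return loads

if __name__ == "__main__":
    winter = simulate_snow_load()
    print(f"peak simulated load: {max(winter):.2f} kPa")
```

Unlike a Bernoulli (snow present/absent) model, the simulated load trace can then be sampled at earthquake occurrence times to combine the two hazards.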
Abstract:
Highway infrastructure plays a significant role in society. The building and upkeep of America’s highways provide society with the necessary means of transportation for the goods and services needed to develop as a nation. However, as a result of economic and social development, vast amounts of greenhouse gas (GHG) emissions are emitted into the atmosphere, contributing to global climate change. In recognition of this, future policies may mandate the monitoring of GHG emissions from public agencies and private industries in order to reduce the effects of global climate change. To effectively reduce these emissions, there must be methods that agencies can use to quantify the GHG emissions associated with constructing and maintaining the nation’s highway infrastructure. Current methods for assessing the impacts of highway infrastructure include methodologies that look at the economic impacts (costs) of constructing and maintaining highway infrastructure over its life cycle. This is known as Life Cycle Cost Analysis (LCCA). With the recognition of global climate change, transportation agencies and contractors are also investigating the environmental impacts associated with highway infrastructure construction and rehabilitation. A common tool for doing so is Life Cycle Assessment (LCA). Traditionally, LCA is used to assess the environmental impacts of products or processes. LCA is an emerging concept in highway infrastructure assessment and is now being implemented and applied to transportation systems. This research focuses on the life cycle GHG emissions associated with the construction and rehabilitation of highway infrastructure using an LCA approach. Life cycle phases of the highway section include the material acquisition and extraction, construction and rehabilitation, and service phases. Departing from traditional approaches that tend to use LCA to compare alternative pavement materials or designs based on estimated inventories, this research proposes a shift to a context-sensitive, process-based approach that uses actual observed construction and performance data to calculate the greenhouse gas emissions associated with highway construction and rehabilitation. The goal is to support strategies that reduce long-term environmental impacts. Ultimately, this thesis outlines techniques that can be used to assess GHG emissions associated with construction and rehabilitation operations to support the overall pavement LCA.
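As a minimal illustration of process-based GHG accounting of the kind described above (the inventory items and emission factors below are hypothetical placeholders, not data from the thesis), emissions can be tallied as quantity times emission factor and grouped by life-cycle phase.

```python
# Hypothetical inventory: (phase, item, quantity, unit, kg CO2e per unit).
INVENTORY = [
    ("materials",      "hot-mix asphalt",   12_000, "t", 60.0),
    ("materials",      "aggregate base",    18_000, "t",  5.0),
    ("construction",   "paver diesel",       9_500, "L",  2.68),
    ("construction",   "haul truck diesel", 22_000, "L",  2.68),
    ("rehabilitation", "mill-and-overlay",   4_000, "t", 60.0),
]

def emissions_by_phase(inventory):
    """Sum GHG emissions (t CO2e) per life-cycle phase: quantity * emission factor."""
    totals = {}
    for phase, _item, qty, _unit, factor_kg in inventory:
        totals[phase] = totals.get(phase, 0.0) + qty * factor_kg / 1000.0
    return totals

if __name__ == "__main__":
    for phase, t_co2e in emissions_by_phase(INVENTORY).items():
        print(f"{phase:15s} {t_co2e:10.1f} t CO2e")
```

In a process-based approach, the quantities would come from observed construction and performance records rather than estimated design inventories.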
Abstract:
During a project, managers encounter numerous contingencies and are faced with the challenging task of making decisions that will effectively keep the project on track. This task is very challenging because construction projects are non-prototypical and the processes are irreversible. Therefore, it is critical to apply a methodological approach to develop a few alternative management decision strategies during the planning phase, which can be deployed to manage alternative scenarios resulting from expected and unexpected disruptions in the as-planned schedule. Such a methodology should have the following features, which are missing in the existing research: (1) looking at the effects of local decisions on the global project outcomes, (2) studying how a schedule responds to decisions and disruptive events, because the risk in a schedule is a function of the decisions made, (3) establishing a method to assess and improve the management decision strategies, and (4) developing project-specific decision strategies, because each construction project is unique and the lessons from a particular project cannot be easily applied to projects that have different contexts. The objective of this dissertation is to develop a schedule-based simulation framework to design, assess, and improve sequences of decisions for the execution stage. The contribution of this research is the introduction of decision strategies for managing a project and the establishment of an iterative methodology to continuously assess and improve decision strategies and schedules. Project managers or schedulers can implement the methodology to develop and identify schedules accompanied by suitable decision strategies to manage a project at the planning stage. The developed methodology also lays the foundation for an algorithm for continuously and automatically generating satisfactory schedules and strategies throughout the construction life of a project. Different from studying isolated daily decisions, the proposed framework introduces the notion of decision strategies to manage the construction process. A decision strategy is a sequence of interdependent decisions determined by resource allocation policies, such as labor, material, equipment, and space policies. The schedule-based simulation framework consists of two parts: experiment design and result assessment. The core of the experiment design is the establishment of an iterative method to test and improve decision strategies and schedules, which is based on the introduction of decision strategies and the development of a schedule-based simulation testbed. The simulation testbed used is the Interactive Construction Decision Making Aid (ICDMA). ICDMA, which was developed previously, has an emulator to duplicate the construction process and a random event generator that allows the decision-maker to respond to disruptions in the emulation. It is used to study how the schedule responds to these disruptions and the corresponding decisions made over the duration of the project, while accounting for cascading impacts and dependencies between activities. The dissertation is organized into two parts. The first part presents the existing research, identifies the departure points of this work, and develops a schedule-based simulation framework to design, assess, and improve decision strategies. In the second part, the proposed schedule-based simulation framework is applied to investigate specific research problems.
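ICDMA itself is not shown here; the toy sketch below only illustrates the notion of a decision strategy as a fixed resource-allocation policy whose consequences are evaluated by simulating the schedule under random disruptions. The activities, the disruption model, and the two labor policies are all hypothetical.

```python
import random

# Hypothetical activity list: (name, remaining work in crew-days), executed in sequence.
PLAN = [("excavation", 10), ("foundation", 15), ("framing", 20)]

def conservative_policy(activity, available_crews):
    """Keep one crew in reserve to absorb future disruptions."""
    return max(available_crews - 1, 0)

def aggressive_policy(activity, available_crews):
    """Commit every available crew to the current activity."""
    return available_crews

def simulate(policy, crews=4, disruption_prob=0.1, trials=2_000):
    """Average project duration when the given labor-allocation policy handles disruptions."""
    total_days = 0
    for _ in range(trials):
        work = [w for _, w in PLAN]
        day, i = 0, 0
        while i < len(work):
            day += 1
            lost = 1 if random.random() < disruption_prob else 0  # crew lost to a disruption today
            assigned = policy(PLAN[i][0], crews - lost)
            work[i] -= assigned
            if work[i] <= 0:
                i += 1
        total_days += day
    return total_days / trials

if __name__ == "__main__":
    print("conservative policy:", simulate(conservative_policy))
    print("aggressive policy:  ", simulate(aggressive_policy))
```

Iterating between such simulated assessments and revisions of the policies is the kind of loop the framework formalizes.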
Abstract:
Water resource depletion and sanitation are growing problems around the world. A solution to both of these problems is the use of composting latrines, which require no water and have been recommended by the World Health Organization as an improved sanitation technology. However, little analysis has been done on the decomposition process occurring inside the latrine, including what temperatures are reached and what variables most affect the composting process. Better knowledge of how outside variables affect composting latrines can aid development workers in the choice of implementing such technology, and in better educating users on the appropriate methods of maintenance. This report presents a full, detailed construction manual and a temperature data analysis of a double vault composting latrine. During the author’s two-year Peace Corps service in rural Paraguay he was involved with building twenty-one composting latrines, and took detailed temperature readings and visual observations of his personal latrine for ten months. The author also took limited temperature readings of fourteen community members’ latrines over a three-month period. These data points were analyzed to find correlations between compost temperatures and several variables. The two main variables found to affect the compost temperatures were the seasonal trends of the outside temperatures, and the mixing of the compost and addition of moisture to it. Outside seasonal temperature changes were compared to those of the compost, and a linear regression was performed, resulting in an R²-value of 0.89. Mixing the compost and adding water, or a water/urine mixture, resulted in temperature increases of the compost 100% of the time, with seasonal temperatures determining the rate and duration of the temperature increases. The temperature readings were also used to find events when certain temperatures were held for sufficient amounts of time to reach total pathogen destruction in the compost. Four different events were recorded when a temperature of 122°F (50°C) was held for at least 24 hours, ensuring total pathogen destruction in that area of the compost. One event of 114.8°F (46°C) held for one week was also recorded, again ensuring total pathogen destruction. Through the analysis of the temperature data, however, it was found that the compost only reached total pathogen destruction levels during ten percent of the data points. Because of this, the storage time recommendations outlined by the World Health Organization should be complied with. The WHO recommends storing compost for 1.5-2 years in climates with ambient temperatures of 2-20°C (35-68°F), and for at least 1 year with ambient temperatures of 20-35°C (68-95°F). If these storage durations are attainable, the use of the double vault composting latrine is an economical and achievable solution to sanitation while conserving water resources.
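As an illustration of the two analyses reported, a linear regression of compost against ambient temperature and the detection of sustained high-temperature events, the following sketch shows one way to compute them; the readings are hypothetical, not the author's data.

```python
# Hypothetical daily readings: (ambient temperature F, compost temperature F).
READINGS = [(60, 75), (65, 82), (70, 90), (75, 96), (80, 104),
            (85, 110), (90, 118), (95, 124), (92, 123), (88, 119)]

def r_squared(pairs):
    """Coefficient of determination for a simple linear fit y = a + b*x."""
    n = len(pairs)
    xs, ys = [p[0] for p in pairs], [p[1] for p in pairs]
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in pairs) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in pairs)
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

def sustained_events(temps_f, threshold_f=122.0, min_hours=24, step_hours=24):
    """Count runs where compost temperature stays at or above the pathogen-destruction threshold."""
    events, run = 0, 0
    for t in temps_f:
        run = run + step_hours if t >= threshold_f else 0
        if run == min_hours:  # count each qualifying run once, when it first reaches min_hours
            events += 1
    return events

if __name__ == "__main__":
    print(f"R^2 = {r_squared(READINGS):.2f}")
    print("events >= 122F for >= 24h:", sustained_events([t for _, t in READINGS]))
```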
Abstract:
The characteristics of the traditional linear economic model are high consumption, high emissions, and low efficiency. Economic development is still achieved largely at the expense of the environment and requires heavy investment of natural resources; this can deliver rapid economic growth, but resource depletion and environmental pollution become increasingly serious. In the 1990s a new economic model, the circular economy, began to enter our vision. The circular economy maximizes production and minimizes the impact of economic activities on the ecological environment by organizing activities into the closed-loop feedback cycle of "resources - production - renewable resources". The circular economy is a better way to resolve the contradiction between economic development and resource shortages. Developing a circular economy has become a major strategic initiative for achieving sustainable development in countries all over the world. Evaluation is a necessary step for regional circular economy development: a quantitative evaluation of the circular economy can better monitor and reveal the contradictions and problems in the process of developing a recycling economy. This thesis will: 1) create an evaluation model framework and new types of industries, and 2) evaluate the current Shanghai circular economy to analyze the situation of Shanghai in the development of the circular economy. I will then propose suggestions about the structure and development of Shanghai's circular economy.
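The thesis's evaluation model is not given in the abstract; the sketch below only illustrates the generic form such a quantitative evaluation often takes: normalize each indicator against a target and combine the results with weights into a composite index. The indicators, targets, and weights are hypothetical.

```python
# Hypothetical indicators: (name, observed value, target value, weight, higher_is_better).
INDICATORS = [
    ("resource productivity (CNY/t)",         9_000, 12_000, 0.30, True),
    ("industrial solid waste reuse rate (%)",    78,     95, 0.25, True),
    ("water reuse rate (%)",                     40,     60, 0.25, True),
    ("CO2 per unit GDP (t/10k CNY)",            1.4,    1.0, 0.20, False),
]

def composite_index(indicators):
    """Weighted sum of indicators normalized to [0, 1] against their targets."""
    score = 0.0
    for _name, value, target, weight, higher_better in indicators:
        ratio = value / target if higher_better else target / value
        score += weight * min(ratio, 1.0)  # cap at 1.0 so exceeding a target is not over-rewarded
    return score

if __name__ == "__main__":
    print(f"circular economy index: {composite_index(INDICATORS):.2f}")  # 1.00 = all targets met
```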
Abstract:
The separation of the valuable portion from the waste portion of an ore is an individual problem for every ore. However, the various methods for accomplishing this end more or less classify themselves by the physical properties of the constituents of the ore. Most of the properties of minerals have been utilized in some way or other to effect the separation of the valuable from the invaluable parts. Practically nothing has been done so far with color and luster to attain this purpose. It is believed that the photo-electric cell could also be used in concentrating a certain class of ores which are not well suited to other methods.
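As a loose modern illustration of the proposal (the original work predates digital control, and the readings and threshold below are hypothetical), a photo-electric sorter reduces to a threshold test on the reflected-light signal from each particle.

```python
# Hypothetical photocell readings (relative reflectance) for a stream of ore particles.
PARTICLES = [0.82, 0.31, 0.76, 0.28, 0.65, 0.40, 0.90]

def sort_by_reflectance(readings, threshold=0.5):
    """Accept bright (valuable) particles and reject dull (waste) ones per the photocell signal."""
    accepted = [r for r in readings if r >= threshold]
    rejected = [r for r in readings if r < threshold]
    return accepted, rejected

if __name__ == "__main__":
    keep, waste = sort_by_reflectance(PARTICLES)
    print(f"accepted {len(keep)} particles, rejected {len(waste)}")
```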
Abstract:
A project to show whether a Warranty or Non-Warranty option would end up cheaper over twenty years.
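The project's own figures are not given here; as a purely hypothetical sketch of how such a twenty-year comparison can be framed, the code below discounts each option's annual costs to present value and picks the cheaper one.

```python
def npv(cashflows, rate=0.04):
    """Present value of a list of annual costs (year 0 first) at the given discount rate."""
    return sum(c / (1 + rate) ** year for year, c in enumerate(cashflows))

# Hypothetical cost streams over 20 years.
warranty     = [1_500] + [300] * 20                                    # upfront warranty plus annual premium
non_warranty = [0] + [150 if y % 5 else 1_200 for y in range(1, 21)]   # routine costs with occasional big repairs

if __name__ == "__main__":
    w, n = npv(warranty), npv(non_warranty)
    print(f"warranty:     {w:,.0f}")
    print(f"non-warranty: {n:,.0f}")
    print("cheaper option:", "warranty" if w < n else "non-warranty")
```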
Abstract:
Electrolytic silver refining was not perfected until the end of the nineteenth century. During the process of development, two systems of silver refining have come into prominence: the Moebius and the Thum types.
Abstract:
Multimodality – the interdependence of semiotic resources in a text – is an essential element of today’s media. The term multimodality attends systematically to the social interpretation of a wide range of communicational forms used in meaning making. A primary focus of social-semiotic multimodal analysis is on mapping how modal resources are used by people in a given social context. In November 2012 the “Ola ke ase” catchphrase, which is a play on “Hola ¿qué hace?”, appeared for the first time in Spain and was immediately adopted as a Twitter hashtag and an image macro series. Its viral spread on social networks has been tremendous, becoming a trending topic in various Spanish-speaking countries. The objective of the analysis is to determine how language and image work together in the “Ola ke ase” meme. The interplay between text and image in one of the original memes and some of its variations is quantitatively analysed applying a social-semiotic approach. The results demonstrate how the “Ola ke ase” meme functions through its multimodal character and its non-standard orthography. The spread of countless variations of the meme shows the social process that goes on in the meaning making of the semiotic elements.
Abstract:
Objective: To determine how a clinician’s background knowledge, their tasks, and displays of information interact to affect the clinician’s mental model. Design: Repeated-measures nested experimental design. Population, Sample, Setting: The populations were gastrointestinal/internal medicine physicians and nurses within the greater Houston area. A purposeful sample of 24 physicians and 24 nurses was studied in 2003. Methods: Subjects were randomized to two different displays of two different mock medical records: one that contained highlighted patient information and one that contained non-highlighted patient information. They were asked to read and summarize their understanding of the patients aloud. Propositional analysis was used to understand their comprehension of the patients. Findings: Different mental models were found between physicians and nurses given the same display of information. The information they shared was minor compared to the variance in their mental models. There was additionally more variance within the nursing mental models than within the physician mental models given different displays of the same information. Statistically, there was no interaction effect between the display of information and clinician type. Only clinician type could account for the differences in clinician comprehension and thus in their mental models of the cases. Conclusion: The factors that may explain the variance within and between the clinician models are clinician type and, only in the nursing group, the use of highlighting.
Abstract:
Inactivation by allelic exchange in clinical isolates of the emerging nosocomial pathogen Enterococcus faecium has been hindered by the lack of efficient tools, and, in this study, transformation of clinical isolates was found to be particularly problematic. For this reason, a vector for allelic replacement (pTEX5500ts) was constructed that includes (i) the pWV01-based gram-positive repAts replication region, which is known to confer a high degree of temperature intolerance, (ii) the Escherichia coli oriR from pUC18, (iii) two extended multiple-cloning sites located upstream and downstream of one of the marker genes for efficient cloning of flanking regions for double-crossover mutagenesis, (iv) transcriptional terminator sites to terminate undesired readthrough, and (v) a synthetic extended promoter region containing the cat gene for allelic exchange and a high-level gentamicin resistance gene, aph(2'')-Id, to distinguish double-crossover recombination, both of which are functional in gram-positive and gram-negative backgrounds. To demonstrate the functionality of this vector, it was used to construct an acm (encoding an adhesin to collagen from E. faecium) deletion mutant of a poorly transformable multidrug-resistant E. faecium endocarditis isolate, TX0082. The acm-deleted strain, TX6051 (TX0082Δacm), was shown to lack Acm on its surface, which resulted in the abolishment of the collagen adherence phenotype observed in TX0082. A mobilizable derivative (pTEX5501ts) that contains the oriT of Tn916 to facilitate conjugative transfer from the transformable E. faecalis strain JH2Sm::Tn916 to E. faecium was also constructed. Using this vector, the acm gene of a nonelectroporable E. faecium wound isolate was successfully interrupted. Thus, pTEX5500ts and its mobilizable derivative demonstrated their roles as important tools by helping to create the first reported allelic replacement in E. faecium; the acm deletion mutant constructed here will be useful for assessing the role of acm in E. faecium pathogenesis using animal models.