849 results for Building Information Modeling (BIM)


Relevance: 30.00%

Abstract:

An information preservation (IP) method has been used to simulate many microscale gas flows. It can efficiently reduce the statistical scatter inherent in conventional particle approaches such as the direct simulation Monte Carlo (DSMC) method. This paper reviews applications of IP to some benchmark problems. Comparison of the IP results with those given by experiment, DSMC, the linearized Boltzmann equation, the Navier-Stokes equations with a slip boundary condition, and the lattice Boltzmann equation shows that the IP method is applicable to microscale gas flows over the entire range of regimes, from continuum to free molecular.
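
To make the scatter-reduction claim concrete, here is a minimal toy sketch (not the published IP algorithm; all numbers are assumed for illustration). Each particle carries, besides its thermal molecular velocity, a separately tracked "information" velocity that follows only the macroscopic motion; averaging the information velocities avoids the thermal noise that dominates a DSMC-style average when the bulk velocity is tiny compared with the thermal speed, as it is in microscale flows.

```python
import numpy as np

# Toy comparison of DSMC-style vs. information-preservation (IP) sampling.
# Parameters are assumed for illustration, not taken from the review.
rng = np.random.default_rng(0)
n = 10_000
u_bulk = 1.0        # macroscopic stream velocity, m/s
c_thermal = 340.0   # thermal speed scale, m/s (>> u_bulk in microflows)

# DSMC-style estimate: average full molecular velocities (bulk + thermal).
molecular_v = u_bulk + c_thermal * rng.standard_normal(n)
dsmc_estimate = molecular_v.mean()   # scatter ~ c_thermal / sqrt(n) ~ 3.4 m/s

# IP-style estimate: each particle also carries a preserved "information
# velocity" tracking only the macroscopic motion (updated by collisions and
# boundaries in the real method), so thermal noise never enters the average.
information_v = np.full(n, u_bulk)
ip_estimate = information_v.mean()

print(f"true bulk velocity:  {u_bulk:.2f} m/s")
print(f"DSMC-style estimate: {dsmc_estimate:+.2f} m/s")
print(f"IP-style estimate:   {ip_estimate:+.2f} m/s")
```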

Relevance: 30.00%

Abstract:

Sea level rise and inundation were identified as the highest priorities in the community-developed Ocean Research Priorities Plan and Implementation Strategy in 2005. Although they remain stated priorities, very few resources have been allocated to this challenge. Inundation poses a substantial risk to many coastal communities, and the risk is projected to increase because of continued development, changes in the frequency and intensity of inundation events, and acceleration in the rate of sea level rise along our vulnerable shorelines. There is an increasing urgency for federal and state governments to focus on the local and regional levels and to consistently provide the information, tools, and methods necessary for adaptation. Calls for action at all levels acknowledge that a viable response must engage federal, state, and local expertise, perspectives, and resources in a coordinated and collaborative effort. A workshop held in December 2000 on coastal inundation and sea level rise proposed a shared framework to help guide where investments should be made to enable state and local governments to assess impacts and initiate adaptation strategies over the next decade.

Relevance: 30.00%

Abstract:

This panel will discuss the research being conducted and the models being used in three current EPA studies of ecosystem services in coastal systems: Tampa Bay, the Chesapeake Bay, and the Coastal Carolinas. These studies are intended to provide a broader and more comprehensive approach to policy and decision-making affecting coastal ecosystems, as well as an account of valued services that have heretofore been largely unrecognized. Interim research products, including updated and integrated spatial data, models and model frameworks, and interactive decision support systems, will be demonstrated to engage potential users and elicit feedback. The anticipated near-term impact of the projects is to increase awareness among coastal communities and coastal managers of the implications of their actions and to foster partnerships for ecosystem services research and applications.

Relevance: 30.00%

Abstract:

The South Carolina Coastal Information Network (SCCIN) emerged as a result of a number of coastal outreach institutions working in partnership to enhance coordination of coastal community outreach efforts in South Carolina. This organized effort, led by the S.C. Sea Grant Consortium and its Extension Program, includes partners from federal and state agencies, regional government agencies, and private organizations seeking to coordinate and/or jointly deliver outreach programs that target coastal community constituents. The Network was officially formed in 2006 with the original intention of fostering intra- and inter-agency communication, coordination, and cooperation. Network partners include the S.C. Sea Grant Consortium, S.C. Department of Health and Environmental Control – Office of Ocean and Coastal Resource Management and Bureau of Water, S.C. Department of Natural Resources – ACE Basin National Estuarine Research Reserve, North Inlet-Winyah Bay National Estuarine Research Reserve, Clemson University Cooperative Extension Service and Carolina Clear, Berkeley-Charleston-Dorchester Council of Governments, Waccamaw Regional Council of Governments, Urban Land Institute of South Carolina, S.C. Department of Archives and History, the National Oceanic and Atmospheric Administration – Coastal Services Center and Hollings Marine Laboratory, Michaux Conservancy, Ashley-Cooper Stormwater Education Consortium, the Coastal Waccamaw Stormwater Education Consortium, the S.C. Chapter of the U.S. Green Building Council, and the Lowcountry Council of Governments.

Relevance: 30.00%

Abstract:

Quantum computing offers powerful new techniques for speeding up the calculation of many classically intractable problems. Quantum algorithms can allow for the efficient simulation of physical systems, with applications to basic research, chemical modeling, and drug discovery; other algorithms have important implications for cryptography and internet security.

At the same time, building a quantum computer is a daunting task, requiring the coherent manipulation of systems with many quantum degrees of freedom while preventing environmental noise from interacting too strongly with the system. Fortunately, we know that, under reasonable assumptions, we can use the techniques of quantum error correction and fault tolerance to achieve an arbitrary reduction in the noise level.

In this thesis, we look at how additional information about the structure of noise, or "noise bias," can improve or alter the performance of techniques in quantum error correction and fault tolerance. In Chapter 2, we explore the possibility of designing certain quantum gates to be extremely robust with respect to errors in their operation. This naturally leads to structured noise where certain gates can be implemented in a protected manner, allowing the user to focus their protection on the noisier unprotected operations.

In Chapter 3, we examine how to tailor error-correcting codes and fault-tolerant quantum circuits in the presence of dephasing-biased noise, where dephasing errors are far more common than bit-flip errors. By using an appropriately asymmetric code, we demonstrate improved error suppression and a reduction in the physical resources required for error correction.
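
As a hedged back-of-the-envelope illustration of this point (the rates and the concatenated repetition-code construction below are assumptions of this sketch, not taken from the thesis): when dephasing dominates, spending more of a fixed qubit budget on phase protection lowers the logical error rate.

```python
import math

def majority_fail(n: int, p: float) -> float:
    # Probability that a length-n repetition block, decoded by majority
    # vote, suffers more than floor(n/2) independent errors of rate p.
    t = n // 2
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(t + 1, n + 1))

# Assumed biased rates: dephasing (Z) errors 100x more likely than bit flips (X).
pz, px = 1e-3, 1e-5

# Two ways to spend 15 physical qubits on a concatenated repetition code:
# an inner block of n_z qubits against Z errors, an outer block of n_x
# qubits against X errors.
for n_z, n_x in [(3, 5), (5, 3)]:
    p_logical = majority_fail(n_z, pz) + majority_fail(n_x, px)
    print(f"n_z={n_z}, n_x={n_x}: logical error ~ {p_logical:.1e}")
```

With these assumed rates, the asymmetric split that matches the stronger protection to the dominant dephasing channel wins by more than two orders of magnitude, which is the qualitative effect exploited in the chapter.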

In Chapter 4, we analyze a variety of protocols for distilling magic states, which enable universal quantum computation, in the presence of faulty Clifford operations. Here again there is a hierarchy of noise levels, with a fixed error rate for faulty gates and a second rate for errors in the distilled states, which decreases as the states are distilled to better quality. The interplay of these different rates sets limits on the achievable distillation and on how quickly states converge to that limit.
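
A hedged sketch of that interplay (a toy recursion, not a protocol analyzed in the chapter): the cubic suppression factor 35ε³ is the standard figure for 15-to-1 magic-state distillation, while the additive c·p term modeling error injected by faulty Cliffords is an assumption of this sketch. Iterating shows the error floor set by the Clifford rate.

```python
# Toy model: one distillation round suppresses the state error cubically
# (35 * eps**3, the standard 15-to-1 figure) while faulty Clifford gates
# reinject errors at an assumed fixed rate c * p.
def distilled_error(eps0: float, p: float, c: float = 1.0, rounds: int = 8):
    eps, history = eps0, [eps0]
    for _ in range(rounds):
        eps = 35 * eps**3 + c * p
        history.append(eps)
    return history

for r, eps in enumerate(distilled_error(eps0=1e-2, p=1e-6)):
    print(f"round {r}: eps = {eps:.2e}")
```

Within a few rounds the error converges to roughly c·p regardless of the starting error, matching the qualitative limit the chapter quantifies.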

Relevance: 30.00%

Abstract:

Multi-step electron tunneling, or "hopping," has become a fast-developing research field, with studies ranging from theoretical model systems and inorganic complexes to biological systems. In particular, the field is exploring hopping mechanisms in new proteins and protein complexes, as well as furthering the understanding of classical biological hopping systems such as ribonucleotide reductase, DNA photolyases, and photosystem II. Despite the plethora of natural systems, only a few biologically engineered systems exist. Engineered hopping systems can provide valuable information on key structural and electronic features, just as other kinds of biological model systems do. They can also harness common biological processes and direct them toward alternative reactions. In this thesis, two new hopping systems are engineered and characterized.

Azurin from Pseudomonas aeruginosa is used as a building block to create the two new hopping systems. Besides being well studied and amenable to mutation, azurin has already been used to successfully engineer a hopping system. The two hopping systems presented in this thesis carry a histidine-attached, high-potential rhenium 4,7-dimethyl-1,10-phenanthroline tricarbonyl label, [Re(dmp)(CO)3]+, which, when excited, acts as the initial electron acceptor. The electron donor is the type I copper of the azurin protein. The hopping intermediates are all tryptophans, mutated into the azurin at selected sites between the photoactive metal label and the protein's metal site. One system exhibits intermolecular hopping across a protein dimer interface; the other undergoes intramolecular multi-step hopping along a tryptophan "wire." The electron transfer reactions are triggered by excitation of the rhenium label and monitored by UV-visible transient absorption, luminescence decay measurements, and time-resolved infrared spectroscopy (TRIR). Both systems were structurally characterized by protein X-ray crystallography.
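
As a hedged illustration of why hopping through an intermediate can beat single-step tunneling (a generic exponential-distance-decay estimate, not a model of these azurin systems; the decay constant β ≈ 1.1 Å⁻¹ is a commonly quoted protein value, and the energetics of the tryptophan intermediate are ignored):

```python
import math

# Electron tunneling rates fall roughly exponentially with distance:
#   k = k0 * exp(-beta * R)
# k0 is an assumed activationless prefactor; beta is a commonly quoted
# protein decay constant.
k0 = 1.0e13    # 1/s
beta = 1.1     # 1/angstrom
R = 20.0       # total donor-acceptor distance, angstrom

k_single = k0 * math.exp(-beta * R)              # one 20 A tunneling step
k_step = k0 * math.exp(-beta * R / 2)            # each 10 A hop
k_hopping = 1.0 / (1.0 / k_step + 1.0 / k_step)  # two sequential steps in series

print(f"single-step 20 A tunneling: {k_single:.1e} 1/s")
print(f"two-step 10 A + 10 A hops:  {k_hopping:.1e} 1/s")
print(f"speedup: {k_hopping / k_single:.0f}x")
```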

Relevance: 30.00%

Abstract:

For damaging levels of response, the force-displacement relationship of a structure is highly nonlinear and history-dependent. Satisfactory analysis of such behavior requires the ability to characterize and model the phenomenon of hysteresis accurately. A number of models have been proposed for response studies of hysteretic structures, some of which are examined in detail in this thesis. Two classes of models are popular in the analysis of curvilinear hysteretic systems. The first is of the distributed element or assemblage type, which models the physical behavior of the system using well-known building blocks. The second is of the differential equation type, which introduces an extra state variable to describe the history dependence of the system; a sketch of such a model appears below.
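
As a concrete instance of the second class, here is a minimal sketch of the widely used Bouc-Wen model (the abstract does not name a specific model, and the parameters are illustrative). The auxiliary state z evolves with the loading and carries the history dependence:

```python
import numpy as np

# Bouc-Wen differential-equation hysteresis model (illustrative parameters).
# The extra state z encodes the loading history:
#   dz/dt = A*xdot - beta*|xdot|*|z|**(n-1)*z - gamma*xdot*|z|**n
A, beta, gamma, n = 1.0, 0.5, 0.5, 1.0
alpha, k = 0.1, 1.0     # post-to-pre-yield stiffness ratio, stiffness

dt = 1e-3
t = np.arange(0.0, 4 * np.pi, dt)
x = np.sin(t)           # cyclic displacement history
xdot = np.cos(t)

z = np.zeros_like(t)
for i in range(len(t) - 1):   # forward-Euler integration of the z equation
    dz = (A * xdot[i]
          - beta * abs(xdot[i]) * abs(z[i]) ** (n - 1) * z[i]
          - gamma * xdot[i] * abs(z[i]) ** n)
    z[i + 1] = z[i] + dz * dt

restoring_force = alpha * k * x + (1 - alpha) * k * z
print(f"peak restoring force over the cycles: {restoring_force.max():.3f}")
```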

Owing to their mathematical simplicity, the latter models have been used extensively for various applications in structural dynamics, most notably in estimating the response statistics of hysteretic systems subjected to stochastic excitation. But the fundamental characteristics of these models are still not clearly understood. A response analysis of systems using both the Distributed Element model and the differential equation model, under a variety of quasi-static and dynamic loading conditions, leads to the following conclusion: caution must be exercised when employing models of the second class in structural response studies, as they can produce misleading results.

Masing's hypothesis, originally proposed for steady-state loading, can be extended to general transient loading as well, leading to considerable simplification in the analysis of Distributed Element models. A simple, nonparametric identification technique is also outlined, by means of which an optimal model representation involving one additional state variable is determined for hysteretic systems.

Relevance: 30.00%

Abstract:

A general framework for multi-criteria optimal design is presented that is well suited to the automated design of structural systems. A systematic computer-aided optimal design decision process is developed that allows the designer to rapidly evaluate and improve a proposed design by taking into account the major factors of interest in design, construction, and operation.

The proposed optimal design process requires selecting the most promising choice of design parameters from a large design space, based on an evaluation using specified criteria. The design parameters specify a particular design, so they relate to member sizes, structural configuration, and so on. The evaluation of a design uses performance parameters, which may include structural response parameters, risks due to uncertain loads and modeling errors, and construction and operating costs. Preference functions implement the design criteria in a "soft" form: each gives a measure of the degree of satisfaction of one design criterion. The overall evaluation measure for a design is built up from the individual measures through a preference combination rule, as sketched below. The goal of the optimal design process is to obtain the design with the highest overall evaluation measure, which makes the process an optimization problem.
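
A hedged sketch of how such soft criteria might be implemented (the preference shapes, limits, and the multiplicative combination rule are all assumptions of this sketch, not the thesis's actual choices):

```python
import math

# Preference functions map a performance parameter to a degree of
# satisfaction in [0, 1]. The shapes and limits below are assumed.
def drift_preference(drift_pct: float) -> float:
    # Fully satisfied below 0.5% story drift, unacceptable beyond 1.5%.
    return min(1.0, max(0.0, (1.5 - drift_pct) / 1.0))

def cost_preference(cost: float, budget: float = 1.0e6) -> float:
    # Smoothly penalize designs whose cost exceeds the budget.
    return 1.0 / (1.0 + math.exp(8.0 * (cost / budget - 1.0)))

def overall_measure(drift_pct: float, cost: float) -> float:
    # Multiplicative combination rule: a badly violated criterion drags
    # the overall evaluation toward zero.
    return drift_preference(drift_pct) * cost_preference(cost)

print(f"balanced design:    {overall_measure(drift_pct=0.8, cost=9.5e5):.3f}")
print(f"cheap but flexible: {overall_measure(drift_pct=1.4, cost=7.0e5):.3f}")
```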

Genetic algorithms are stochastic optimization methods based on evolutionary theory. They provide the exploration power necessary to search high-dimensional design spaces for these optimal solutions. Two special genetic algorithms, hGA and vGA, are presented here for continuous and discrete optimization problems, respectively.
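
A minimal generic real-coded GA sketch (elitist selection, blend crossover, Gaussian mutation; a textbook GA under assumed settings, not the thesis's hGA or vGA):

```python
import math
import random

def genetic_search(fitness, bounds, pop_size=40, generations=60, seed=1):
    # Generic real-coded GA: keep the best half, breed the rest by
    # averaging two elite parents and adding Gaussian mutation.
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [min(hi, max(lo, (x + y) / 2 + rng.gauss(0.0, 0.05 * (hi - lo))))
                     for x, y, (lo, hi) in zip(a, b, bounds)]
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# Toy two-parameter "design": an assumed overall evaluation measure that
# peaks at beam depth 0.6 m and column size 0.4 m (hypothetical values).
def evaluation(design):
    depth, col = design
    return math.exp(-((depth - 0.6) ** 2 + (col - 0.4) ** 2) / 0.02)

best = genetic_search(evaluation, bounds=[(0.2, 1.2), (0.2, 1.2)])
print("best design parameters:", [round(v, 3) for v in best])
```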

The methodology is demonstrated with several examples involving the design of truss and frame systems. These examples are solved by using the proposed hGA and vGA.

Relevance: 30.00%

Abstract:

Experimental work was performed to delineate the system of digested sludge particles and associated trace metals and also to measure the interactions of sludge with seawater. Particle-size and particle-number distributions were measured with a Coulter Counter. Number counts in excess of 10¹² particles per liter were found in both the City of Los Angeles Hyperion mesophilic digested sludge and the Los Angeles County Sanitation Districts (LACSD) digested primary sludge. More than 90 percent of the particles had diameters less than 10 microns.

Total and dissolved trace metals (Ag, Cd, Cr, Cu, Fe, Mn, Ni, Pb, and Zn) were measured in LACSD sludge. Manganese was the only metal whose dissolved fraction exceeded one percent of the total metal. Sedimentation experiments for several dilutions of LACSD sludge in seawater showed that the sedimentation velocities of the sludge particles decreased as the dilution factor increased. A tenfold increase in dilution shifted the sedimentation velocity distribution by an order of magnitude. Chromium, Cu, Fe, Ni, Pb, and Zn were also followed during sedimentation. To a first approximation these metals behaved like the particles.

Solids and selected trace metals (Cr, Cu, Fe, Ni, Pb, and Zn) were monitored in oxic mixtures of both Hyperion and LACSD sludges for periods of 10 to 28 days. Less than 10 percent of the filterable solids dissolved or were oxidized. Only Ni was mobilized away from the particles. The majority of the mobilization was complete in less than one day.

The experimental data of this work were combined with oceanographic, biological, and geochemical information to propose and model the discharge of digested sludge to the San Pedro and Santa Monica Basins. A hydraulic computer simulation of a round buoyant jet in a density-stratified medium showed that discharges of the sludge-effluent mixture at depths of 730 m would rise no more than 120 m. Initial jet mixing provided dilution estimates of 450 to 2600. Sedimentation analyses indicated that the solids would reach the sediments within 10 km of the discharge point.

Mass balances on the oxidizable chemical constituents in sludge indicated that the nearly anoxic waters of the basins would become wholly anoxic as a result of the proposed discharges. Chemical-equilibrium computer modeling of the sludge digester and of dilutions of sludge in anoxic seawater predicted that the chemistry of all trace metals except Cr and Mn would be controlled by the precipitation of metal sulfide solids. This metal speciation held for dilutions up to 3000.

The net environmental impacts of this scheme should be salutary. The trace metals in the sludge should be immobilized in the anaerobic bottom sediments of the basins, where apparently no life forms higher than bacteria are present to be disrupted. The proposed deep-water discharges would remove the need for potentially expensive and energy-intensive land disposal alternatives and would end discharge to the highly productive waters near the ocean surface.

Relevance: 30.00%

Abstract:

This thesis addresses a series of topics related to the question of how people find foreground objects in complex scenes. Using both computer vision modeling and psychophysical analyses, we explore the computational principles of low- and mid-level vision.

We first explore computational methods for generating saliency maps from images and image sequences. We propose an extremely fast algorithm called the Image Signature that detects the locations in an image that attract human gaze. Through a series of validations based on human behavioral data collected from various psychophysical experiments, we conclude that the Image Signature and its spatiotemporal extension, the Phase Discrepancy, are among the most accurate algorithms for saliency detection under various conditions.
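
A hedged single-channel sketch of the core computation (keeping only the sign pattern of the image's DCT, reconstructing, squaring, and blurring; the blur width is an assumed parameter):

```python
import numpy as np
from scipy.fft import dctn, idctn
from scipy.ndimage import gaussian_filter

def image_signature_saliency(img: np.ndarray, blur_sigma: float = 3.0) -> np.ndarray:
    # Signature = sign of the DCT coefficients; the squared reconstruction
    # concentrates energy at sparse (salient) image structure.
    signature = np.sign(dctn(img, norm="ortho"))
    reconstruction = idctn(signature, norm="ortho")
    saliency = gaussian_filter(reconstruction ** 2, blur_sigma)
    return saliency / saliency.max()

# Usage: a flat background with one small bright square "object".
img = np.zeros((64, 64))
img[28:36, 40:48] = 1.0
sal = image_signature_saliency(img)
print("peak saliency at:", np.unravel_index(sal.argmax(), sal.shape))
```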

In the second part, we bridge the gap between fixation prediction and salient object segmentation with two efforts. First, we propose a new dataset that contains both fixation and object segmentation information. By presenting the two types of human data in the same dataset, we are able to analyze their intrinsic connection and to understand the drawbacks of today's "standard" but inappropriately labeled salient object segmentation datasets. Second, we propose a salient object segmentation algorithm. Based on our discoveries about the connection between fixation data and salient object segmentation data, our model significantly outperforms all existing models on all three datasets, by large margins.

In the third part of the thesis, we discuss topics around the human factors of boundary analysis. Closely related to salient object segmentation, boundary analysis focuses on delimiting the local contours of an object. We identify potential pitfalls in algorithm evaluation for the boundary detection problem. Our analysis indicates that today's popular boundary detection datasets contain a significant level of noise, which may severely influence benchmarking results. To give further insight into the labeling process, we propose a model characterizing the human factors at work during labeling.

The analyses reported in this thesis offer new perspectives on a series of interrelated issues in low- and mid-level vision. They raise warning signs about some of today's "standard" procedures, while proposing new directions to encourage future research.

Relevance: 30.00%

Abstract:

A Digital Atlas is an atlas conceived with computational techniques and, consequently, accessible through a computer. Structured in a graphical environment, it can contain, in addition to maps, texts, photographs, statistical data, charts, and tables. Because it exists in digital form, an expressive range of themes, formats, and scales can be used. This dissertation presents a Digital Atlas prototype as a contribution to the Municipal Information System (SIM) of the municipality of São João de Meriti, RJ. The SIM, whose focus is municipal services, is meant to serve the municipality itself, its citizens, and other parties interested in the city, and its information is fundamental to improving municipal administration. The research was directed at the theme of habitability, a set of conditions oriented toward building a healthy habitat, spanning physical, psychological, social, cultural, and environmental topics. Within habitability, the subthemes of water supply infrastructure, sewage, garbage collection, health, and education were examined and compared across the municipality's neighborhoods. The SIM and habitability are contemplated in the city's master plan, which provides a large part of the theoretical grounding of the dissertation. The Digital Atlas prototype was modeled and implemented with free software, and it provides access to thematic maps and other information about São João de Meriti.

Relevance: 30.00%

Abstract:

Editors: Micaela Muñoz-Calvo; Carmen Buesa-Gómez