936 results for internal information flow


Relevance: 30.00%

Publisher:

Abstract:

Information management is a key aspect of successful construction projects. Inaccurate measurements and conflicting data can lead to costly mistakes, and vague quantities can ruin estimates and schedules. Building information modeling (BIM) augments a 3D model with a wide variety of information, which reduces many sources of error and can detect conflicts before they occur. Because new technology is often more complex, it can be difficult to integrate effectively with existing business practices. In this paper, we answer two questions: How can BIM add value to construction projects? And what lessons can be learned from other companies that use BIM or similar technology? Previous research treated the technology as if it were simply a tool, observing problems that occurred while integrating new technology into existing practices. Our research instead looks at the flow of information through a company and its network, seeing all the actors as part of an ecosystem. Building upon this idea, we propose the metaphor of an information supply chain to illustrate how BIM can add value to a construction project. The paper concludes with two case studies. The first illustrates a failure in the flow of information that could have been prevented by using BIM. The second profiles a leading design firm that has used BIM products for many years and shows the real benefits of this technology.

Relevance: 30.00%

Publisher:

Abstract:

This dissertation presents an effective quasi one-dimensional (1-D) computational simulation tool and a full two-dimensional (2-D) computational simulation methodology for steady annular/stratified internal condensing flows of pure vapor. These simulation tools are used to investigate internal condensing flows in both gravity-driven and shear-driven environments. Through accurate numerical simulations of the full two-dimensional governing equations, results for laminar/laminar condensing flows inside mm-scale ducts are presented. The methodology has been developed on the MATLAB/COMSOL platform and is currently capable of simulating film-wise condensation for steady (and unsteady) flows. Moreover, a novel 1-D solution technique capable of simulating condensing flows inside rectangular and circular ducts with different thermal boundary conditions is also presented. The results obtained from the 2-D scientific tool and the 1-D engineering tool are validated and synthesized with experimental results for gravity-dominated flows inside a vertical tube and an inclined channel, and for shear/pressure-driven flows inside horizontal channels. Furthermore, these simulation tools are employed to demonstrate key differences in physics between gravity-dominated and shear/pressure-driven flows. A transition map that distinguishes shear-driven, gravity-driven, and “mixed” driven flow zones within the non-dimensional parameter space governing these duct flows is presented, along with film thickness and heat transfer correlations valid in each zone. It is also shown that internal condensing flows in micrometer-scale ducts are shear driven even in different gravitational environments. The full 2-D steady computational tool has been employed to investigate the length of annularity. The result for a shear-driven flow in a horizontal channel shows that, in the absence of any noise or pressure fluctuation at the inlet, the onset of non-annularity is partly due to insufficient shear at the liquid-vapor interface. This result is being further corroborated/investigated by R. R. Naik with the help of the unsteady simulation tool. The condensing flow results and flow physics understanding developed through these simulation tools will be instrumental in the reliable design of modern micro-scale and space-based thermal systems.
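For intuition only, here is a minimal Python sketch of how such a gravity/shear transition map might be parameterized; the force ratio and the thresholds for the "mixed" band are illustrative assumptions, not the dissertation's actual criterion:

```python
# Illustrative sketch (not the dissertation's criterion) of classifying a
# condensate film by the ratio of gravitational body force per unit area
# (rho * g * delta) to interfacial shear stress. Threshold band assumed.

def regime(rho_liq, g, delta, tau_interface, mixed_band=(0.5, 2.0)):
    """Return 'gravity driven', 'shear driven', or 'mixed'."""
    ratio = rho_liq * g * delta / tau_interface
    lo, hi = mixed_band
    if ratio < lo:
        return "shear driven"
    if ratio > hi:
        return "gravity driven"
    return "mixed"

# Micrometer-scale film: even at Earth gravity the ratio is tiny, consistent
# with the abstract's claim that micro-scale ducts are shear driven in any
# gravitational environment.
print(regime(rho_liq=1000.0, g=9.81, delta=5e-6, tau_interface=1.0))
```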

Relevance: 30.00%

Publisher:

Abstract:

An optimizing compiler's internal representation fundamentally affects the clarity, efficiency, and feasibility of the optimization algorithms employed by the compiler. Static Single Assignment (SSA), a state-of-the-art program representation, has great advantages yet can still be improved. This dissertation explores the domain of single assignment beyond SSA and presents two novel program representations: Future Gated Single Assignment (FGSA) and Recursive Future Predicated Form (RFPF). Both FGSA and RFPF embed control flow and data flow information, enabling efficient traversal of program information and thus leading to better and simpler optimizations. We introduce the future value concept, the design basis of both FGSA and RFPF, which permits a consumer instruction to be encountered before the producer of its source operand(s) in a control flow setting. We show that FGSA is efficiently computable by using a series of T1/T2/TR transformations, yielding an expected linear time algorithm that combines the construction of the pruned single assignment form with live analysis for both reducible and irreducible graphs. The approach yields an average reduction of 7.7%, with a maximum of 67%, in the number of gating functions compared to the pruned SSA form on the SPEC2000 benchmark suite. We present a solid and near-optimal framework to perform the inverse transformation from single assignment programs. We demonstrate the importance of unrestricted code motion and present RFPF. We develop algorithms that enable instruction movement in acyclic as well as cyclic regions, and show how easily optimizations such as Partial Redundancy Elimination can be performed on RFPF.
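For background, a minimal sketch of the classical T1/T2 flow-graph reductions the algorithm builds on (the TR transformation is specific to this dissertation and is omitted; the dict-of-successors encoding is an assumption):

```python
# Minimal sketch of the classical T1/T2 flow-graph reductions.
# Graph encoding assumed here: node -> set of successor nodes.

def t1_t2_reduce(succ, entry):
    """Apply T1 (remove a self-loop) and T2 (merge a node into its unique
    predecessor) until neither applies. A flow graph is reducible iff it
    collapses to the single entry node."""
    succ = {n: set(s) for n, s in succ.items()}
    changed = True
    while changed:
        changed = False
        for n in succ:                      # T1: drop self-loops
            if n in succ[n]:
                succ[n].discard(n)
                changed = True
        preds = {n: set() for n in succ}    # recompute predecessors
        for n, ss in succ.items():
            for s in ss:
                preds[s].add(n)
        for n in list(succ):                # T2: unique-predecessor merge
            if n != entry and len(preds[n]) == 1:
                (p,) = preds[n]
                sn = succ.pop(n)
                succ[p].discard(n)
                succ[p] |= sn - {n}
                changed = True
                break                       # predecessor sets are now stale
    return succ

g = {"a": {"b", "c"}, "b": {"d"}, "c": {"d"}, "d": {"a"}}
print(t1_t2_reduce(g, "a"))  # collapses to {'a': set()}: g is reducible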

Relevance: 30.00%

Publisher:

Abstract:

The field of microfluidics has grown in popularity alongside such areas as MEMS, microreactors, and microscale heat exchangers. A comprehensive understanding of dissipation mechanisms for fluid flow in microchannels is required to accurately predict the behavior of these small systems. Tests were performed using a constant pressure potential created by two immiscible fluids juxtaposed in a microchannel. This study focused on the flow and dissipation mechanisms in round and square microchannels. There are four major dissipation mechanisms in slug flow: wall shear, dissipation at the contact line, menisci interaction, and stretching of the interface. A force balance between the internal driving potential, viscous drag, and interface stretching was used to develop a model for predicting the velocity of a bislug in a microchannel. Interface stretching is a dissipation mechanism included here because of the unique system properties; it becomes increasingly important as the bislug decreases in length.
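A minimal sketch of the kind of force balance described, assuming only a Laplace driving pressure and Poiseuille wall drag (the study's full model also accounts for contact-line dissipation, menisci interaction, and interface stretching; all values below are assumed):

```python
# Illustrative force-balance sketch for bislug motion in a round capillary.
# Not the paper's exact model: net capillary driving pressure balances the
# Poiseuille drag of the two slugs in series; other losses are neglected.

def bislug_velocity(r, sigma_net, mu1, L1, mu2, L2):
    """r: tube radius [m]; sigma_net: net driving surface-tension term [N/m]
    from the interfaces; mu_i, L_i: slug viscosities [Pa s] and lengths [m]."""
    dp = 2.0 * sigma_net / r                      # net driving pressure [Pa]
    return dp * r**2 / (8.0 * (mu1 * L1 + mu2 * L2))

# Hypothetical water/oil bislug in a 500-um tube: the predicted velocity
# grows as the slugs shorten, which is exactly where the neglected
# interface-stretching loss becomes important.
print(bislug_velocity(r=250e-6, sigma_net=0.01, mu1=1e-3, L1=2e-3,
                      mu2=5e-4, L2=2e-3))
```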

Relevance: 30.00%

Publisher:

Abstract:

Riparian zones play an important part in filtering sediment from upland agricultural lands. This work makes use of multispectral, high-spatial-resolution remote sensing imagery (Quickbird by Digital Globe) and geographic information systems (GIS) to characterize significant riparian attributes in the USDA’s experimental watershed, Goodwin Creek, located in northern Mississippi. Significant riparian filter characteristics include the width of the strip, vegetation properties, soil properties, topography, and upland land use practices. The land use and vegetation classes are extracted from the remotely sensed image with a supervised maximum likelihood classification algorithm. Accuracy assessment resulted in an acceptable overall accuracy of 84 percent. In addition to sensing riparian vegetation characteristics, this work addresses the issue of concentrated flow bypassing a riparian filter. Results indicate that Quickbird multispectral remote sensing and GIS data are capable of determining riparian impact on sediment filtering. Quickbird imagery is a practical solution for land managers to monitor the effectiveness of riparian filtration in an agricultural watershed.
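A minimal sketch of the Gaussian maximum likelihood classification step used for such imagery (the training-data layout, band count, and class names are hypothetical):

```python
import numpy as np

# Minimal sketch of supervised (Gaussian) maximum likelihood classification
# for multispectral pixels. Training data and classes are hypothetical.

def train(samples_by_class):
    """samples_by_class: {name: (n_pixels, n_bands) array of training pixels}.
    Returns per-class mean vector and covariance matrix."""
    return {c: (x.mean(axis=0), np.cov(x, rowvar=False))
            for c, x in samples_by_class.items()}

def classify(pixel, params):
    """Assign the class that maximizes the Gaussian log-likelihood."""
    def loglik(mu, cov):
        d = pixel - mu
        return -0.5 * (np.log(np.linalg.det(cov))
                       + d @ np.linalg.solve(cov, d))
    return max(params, key=lambda c: loglik(*params[c]))

rng = np.random.default_rng(0)
params = train({"forest": rng.normal(40, 5, (100, 4)),   # 4 spectral bands
                "cropland": rng.normal(90, 8, (100, 4))})
print(classify(np.array([42.0, 38.0, 41.0, 44.0]), params))  # -> "forest"
```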

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: Perforating arteries are commonly involved during the surgical dissection and clipping of intracranial aneurysms. Occlusion of perforating arteries is responsible for ischemic infarction and poor outcomes. The goal of this study is to describe the usefulness of near-infrared indocyanine green videoangiography (ICGA) for the intraoperative assessment of blood flow in perforating arteries visible in the surgical field during clipping of intracranial aneurysms. In addition, we analyzed the incidence of perforating vessels involved during aneurysm surgery and the incidence of ischemic infarct caused by compromised small arteries. METHODS: Sixty patients with 64 aneurysms were surgically treated and prospectively included in this study. Intraoperative ICGA was performed using a surgical microscope (Carl Zeiss Co., Oberkochen, Germany) with integrated ICGA technology. The presence and involvement of perforating arteries were analyzed in the microsurgical field during surgical dissection and clip application. Vascular patency after clipping was also assessed. Only those small arteries that were not visible on preoperative digital subtraction angiography were considered for analysis. RESULTS: ICGA was able to visualize flow in all patients in whom perforating vessels were found in the microscope field. Among the 36 patients whose perforating vessels were visible on ICGA, 11 (30%) presented a close relation between the aneurysm and perforating arteries. In one (9%) of these 11 patients, ICGA showed occlusion of a P1 perforating artery after clip application, which led to immediate correction of the clip, confirmed by reestablishment of flow visible on ICGA, without clinical consequences. Four patients (6.7%) presented with postoperative perforating artery infarcts, three of whom had perforating arteries that were either not visible or distant from the aneurysm. CONCLUSION: The involvement of perforating arteries during clip application for aneurysm occlusion is a common finding. Intraoperative ICGA may provide visual information regarding the patency of these small vessels.

Relevance: 30.00%

Publisher:

Abstract:

This report is a dissertation proposal that focuses on the energy balance within an internal combustion engine with a unique coolant-based waste heat recovery system. The U.S. Energy Information Administration predicts that the transportation sector in the United States will consume approximately 15 million barrels per day of liquid fuels by the year 2025. The proposed coolant-based waste heat recovery technique has the potential to reduce the yearly usage of those liquid fuels by nearly 50 million barrels by recovering even a modest 1% of the energy wasted through the coolant system. The proposed technique implements thermoelectric generators (TEGs) on the outside cylinder walls of an internal combustion engine. For this research, one outside cylinder wall of a twin-cylinder, 26-horsepower, water-cooled gasoline engine will be fitted with a TEG surrogate material. The vertical location of these TEG surrogates along the water jacket will be varied, along with the surrogate thermal conductivity. The aim of this proposed dissertation is to obtain empirical evidence of the impact of installing TEGs in the water jacket area, including effects on energy distribution and cylinder wall temperatures. The results can be used for future research on larger engines and will also assist with proper TEG selection to maximize energy recovery efficiency.
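A quick back-of-the-envelope check of the quoted savings figure, using only numbers stated in the abstract (the barrel-for-barrel equivalence of recovered energy is a simplifying assumption):

```python
# Back-of-the-envelope check of the fuel-savings claim above.
daily_use_bbl = 15e6       # projected U.S. transport consumption, barrels/day
recovered_fraction = 0.01  # the abstract's "modest 1%" recovery
yearly_savings = daily_use_bbl * 365 * recovered_fraction
print(f"{yearly_savings / 1e6:.0f} million barrels/year")  # ~55 million
```

The simple product gives roughly 55 million barrels per year, the same order as the abstract's "nearly 50 million."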

Relevance: 30.00%

Publisher:

Abstract:

Power transformers are key components of the power grid and are also among the components most exposed to a variety of power system transients. The failure of a large transformer can cause severe monetary losses to a utility, so adequate protection schemes are of great importance to avoid transformer damage and maximize continuity of service. Computer modeling can be an efficient tool to improve the reliability of a transformer protective relay application. Unfortunately, transformer models presently available in commercial software lack completeness in the representation of several aspects, such as internal winding faults, which are a common cause of transformer failure. It is also important to adequately represent the transformer at frequencies higher than the power frequency for a more accurate simulation of switching transients, since these are a well-known cause of unwanted tripping of protective relays. This work develops new capabilities for the Hybrid Transformer Model (XFMR) implemented in ATPDraw to allow the representation of internal winding faults and slow-front transients up to 10 kHz. The new model can be developed from either of two sources of information: 1) test report data or 2) design data. When only test-report data are available, a higher-order leakage inductance matrix is created from standard measurements. If design information is available, a Finite Element Model (FEM) is created to calculate the leakage parameters for the higher-order model. An analytical model is also implemented as an alternative to FEM modeling. Measurements on 15-kVA 240Δ/208Y V and 500-kVA 11430Y/235Y V distribution transformers were performed to validate the model. A transformer model valid for simulations at frequencies above the power frequency was developed by further dividing the windings into multiple sections and including a higher-order capacitance matrix. Frequency-scan laboratory measurements were used to benchmark the simulations. Finally, a stability analysis of the higher-order model was made by analyzing the trapezoidal rule for numerical integration as used in ATP. Numerical damping was also added to suppress oscillations locally when discontinuities occurred in the solution. A maximum error magnitude of 7.84% was encountered in the simulated currents for different turn-to-ground and turn-to-turn faults. The FEM approach provided the most accurate means of determining the leakage parameters for the ATP model. The higher-order model was found to reproduce the short-circuit impedance acceptably up to about 10 kHz, and the behavior at the first anti-resonant frequency was better matched with the measurements.
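To illustrate the stability issue, here is a minimal sketch of trapezoidal integration on a single series RL branch with a theta-blend form of numerical damping (an illustrative device, not ATP's actual damping scheme; all circuit values are assumptions):

```python
# Sketch of trapezoidal-rule ringing at a discontinuity and the effect of
# numerical damping, on a single series RL branch: L di/dt = v - R i.

def step(i, v_old, v_new, R, L, dt, theta=0.5):
    """theta = 0.5 -> trapezoidal rule; theta -> 1.0 blends toward
    backward Euler, which damps numerical oscillations."""
    f_old = (v_old - R * i) / L
    return (i + dt * ((1 - theta) * f_old + theta * v_new / L)) \
        / (1 + dt * theta * R / L)

R, L, dt = 1.0, 1e-3, 5e-3          # dt >> L/R: deliberately stiff
for theta, label in [(0.5, "trapezoidal"), (0.8, "damped")]:
    i, v_old, trace = 0.0, 0.0, []
    for _ in range(5):              # 100 V step applied at t = 0
        i = step(i, v_old, 100.0, R, L, dt, theta)
        v_old = 100.0
        trace.append(round(i, 1))
    print(label, trace)  # trapezoidal rings around 100 A; damped does not
```

Running the sketch shows the pure trapezoidal rule oscillating about the 100 A steady state after the voltage discontinuity, while the damped variant settles monotonically, which is the behavior the added numerical damping in the higher-order model is meant to achieve locally.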

Relevance: 30.00%

Publisher:

Abstract:

Lava flow modeling can be a powerful tool in hazard assessment; however, the ability to produce accurate models is usually limited by a lack of high-resolution, up-to-date Digital Elevation Models (DEMs). This is especially obvious in places such as Kilauea Volcano (Hawaii), where active lava flows frequently alter the terrain. In this study, we use a new technique to create high-resolution DEMs of Kilauea using synthetic aperture radar (SAR) data from the TanDEM-X (TDX) satellite. We convert raw TDX SAR data into a geocoded DEM using GAMMA software [Werner et al., 2000]. This process can be completed in several hours and permits creation of updated DEMs as soon as new TDX data are available. To test the DEMs, we use the Harris and Rowland [2001] FLOWGO lava flow model combined with the Favalli et al. [2005] DOWNFLOW model to simulate the 3-15 August 2011 eruption on Kilauea's East Rift Zone. Results were compared with simulations using the older, lower-resolution 2000 SRTM DEM of Hawaii. Effusion rates used in the model are derived from MODIS thermal infrared satellite imagery. FLOWGO simulations using the TDX DEM produced a single flow line that matched the August 2011 flow almost perfectly, but could not recreate the entire flow field due to the relatively high noise level of the DEM. The issue of short model flow lengths can be resolved by filtering noise from the DEM. Model simulations using the outdated SRTM DEM produced a flow field that followed a different trajectory from that observed. Numerous lava flows have been emplaced at Kilauea since the creation of the SRTM DEM, leading the model to project flow lines into areas that have since been covered by fresh lava. These results show that DEMs can quickly become outdated on active volcanoes, but our new technique offers the potential to produce accurate, updated DEMs for modeling lava flow hazards.
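A minimal sketch of the DEM noise-filtering step, assuming a median filter (the abstract does not state which filter was used, and the window size is an assumption):

```python
import numpy as np
from scipy.ndimage import median_filter

# Sketch of despeckling a noisy DEM before flow routing; a median filter
# suppresses speckle-like noise while preserving sharp terrain features
# such as channel walls better than a mean filter would.
def despeckle_dem(dem, size=3):
    return median_filter(dem, size=size)

noisy = np.random.default_rng(0).normal(1000.0, 2.0, (5, 5))  # toy 5x5 DEM
print(despeckle_dem(noisy).round(1))
```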

Relevance: 30.00%

Publisher:

Abstract:

Space-based (satellite, scientific probe, space station, etc.) and millimeter- to micro-scale (such as are used in high-power electronics cooling, weapons cooling in aircraft, etc.) condensers and boilers are shear/pressure driven. They are of increasing interest to system engineers for thermal management because flow boilers and flow condensers offer both high fluid flow-rate-specific heat transfer capacity and very low thermal resistance between the fluid and the heat exchange surface, so large amounts of heat may be removed using reasonably sized devices without the need for excessive temperature differences. However, flow stability issues and degradation of performance of shear/pressure-driven condensers and boilers due to undesirable flow morphology over large portions of their lengths have mostly prevented their use in these applications. This research is part of an ongoing investigation seeking to close the gap between science and engineering by analyzing two key innovations that could help address these problems. First, it is recommended that the condenser and boiler be operated in an innovative flow configuration that provides a non-participating core vapor stream to stabilize the annular flow regime throughout the device length, accomplished in an energy-efficient manner by means of ducted vapor re-circulation. This is demonstrated experimentally. Second, suitable pulsations applied to the vapor entering the condenser or boiler (from the re-circulating vapor stream) greatly reduce the thermal resistance of the already effective annular flow regime. For the experiments reported here, application of pulsations increased time-averaged heat flux by up to 900% at a location within the flow condenser and up to 200% at a location within the flow boiler, measured at the heat-exchange surface. Traditional fully condensing flows, reported here for comparison purposes, show similar heat-flux enhancements due to imposed pulsations over a range of frequencies. Shear/pressure-driven condensing and boiling flow experiments are carried out in horizontal mm-scale channels with heat exchange through the bottom surface. The sides and top of the flow channel are insulated. The working fluid is FC-72 from 3M Corporation.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Increasing age and comorbidities among patients undergoing coronary artery bypass grafting (CABG) have stimulated extensive research into alternative grafts. No-React treatment is intended to render the tissue resistant to degeneration and reduce the early inflammatory response. The aim of the present study was an invasive assessment of the patency of No-React bovine internal mammary artery (NRIMA) grafts used as bypass conduits in CABG surgery. PATIENTS AND METHODS: Nineteen NRIMA grafts were used in 17 patients (2.9%) out of a total of 572 patients undergoing CABG surgery within a 12-month period. All intraoperative data were assessed and in-hospital outcome was analysed. Follow-up examination was performed 7.0±4.0 months after initial surgery, including clinical status and coronary angiography to assess the patency of the NRIMA grafts. RESULTS: Average perioperative flow of all NRIMA grafts was 71±60 ml/min. One patient died in hospital due to multi-organ failure. Four patients refused invasive assessment. Follow-up was complete in 12 patients with 13 NRIMA grafts overall. Nine NRIMA grafts (69.2%) were used for the right coronary system, two (15.4%) on the LAD, and two on the circumflex artery. Graft patency was 23.1% and was independent of the intraoperative flow measurement. CONCLUSIONS: NRIMA grafts show very low patency and cannot be recommended as coronary bypass graft conduits. Patency was independent of the perioperative flow assessed by Doppler ultrasound. Because of this unsatisfactory observation, this type of graft should be utilised only as a last-resort conduit and only to revascularise less important target vessels, such as the end branches of the right coronary artery.

Relevance: 30.00%

Publisher:

Abstract:

Micro-scale, two-phase flow is found in a variety of devices such as lab-on-a-chip systems, bio-chips, micro-heat exchangers, and fuel cells. Knowledge of the fluid behavior near the dynamic gas-liquid interface is required for developing accurate predictive models. Light is distorted near a curved gas-liquid interface, preventing accurate measurement of interfacial shape and internal liquid velocities. This research focused on the development of experimental methods designed to isolate and probe dynamic liquid films and measure velocity fields near a moving gas-liquid interface. A high-speed, reflectance, swept-field confocal (RSFC) imaging system was developed for imaging near curved surfaces. Experimental studies of the dynamic gas-liquid interface of micro-scale, two-phase flow were conducted in three phases. Dynamic liquid film thicknesses of segmented, two-phase flow were measured using the RSFC and compared to a classic film-thickness deposition model. Flow fields near a steadily moving meniscus were measured using the RSFC and particle tracking velocimetry. The RSFC provided high-speed imaging near the menisci without the distortion caused by the gas-liquid interface. Finally, interfacial morphology for internal two-phase flow and droplet evaporation was measured using interferograms produced by the RSFC imaging technique. Each technique can be used independently or simultaneously.
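The classic benchmark for deposited film thickness in segmented flow is Bretherton's law; a minimal sketch, assuming (the abstract does not say) that this is the comparison model, with hypothetical fluid values:

```python
# Sketch of the classic Bretherton (1961) film-deposition law, often used
# as the benchmark for segmented-flow film thickness. Whether this is the
# study's exact comparison model is an assumption.
def bretherton_film_thickness(radius, mu, velocity, sigma):
    """h = 1.34 * R * Ca^(2/3), valid for small capillary number Ca."""
    ca = mu * velocity / sigma          # capillary number
    return 1.34 * radius * ca ** (2.0 / 3.0)

# Example: water-like fluid in a 500-um-diameter channel at 10 mm/s
# gives a film of order 1 um, i.e. well below optical resolution limits,
# which is why dedicated techniques such as the RSFC are needed.
print(bretherton_film_thickness(radius=250e-6, mu=1e-3,
                                velocity=0.01, sigma=0.07))
```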

Relevance: 30.00%

Publisher:

Abstract:

Discussion about the optimal treatment for different subsets of patients suffering from coronary artery disease has recently re-emerged, mainly because of uncertainty among doctors and patients regarding the phenomenon of unpredictable early and late stent thrombosis. Surgical revascularization using multiple arterial bypass grafts has repeatedly proven its superiority over percutaneous intervention techniques, especially in patients suffering from left main stem disease and three-vessel coronary disease. Several prospective randomized multicenter studies comparing early and mid-term results following PCI and CABG have been very restrictive with respect to patient enrollment, with less than 5% of all patients treated during the same time period having been enrolled. Coronary artery bypass grafting allows the most complete revascularization in one session, because all target coronary vessels larger than 1 mm can be bypassed in their distal segments. Once the patient has been referred for surgery, surgeons have to consider the most complete arterial revascularization in order to decrease the long-term necessity for re-revascularization; for instance, the patency rate of the left internal thoracic artery grafted to the distal part of the left anterior descending artery may be as high as 90-95% after 10 to 15 years. Early mortality following isolated CABG operations has been as low as 0.6 to 1% in the most recent period (reports from the University Hospital Berne and the University Hospital of Zurich); besides these excellent results, the CABG option seems to be less expensive than PCI over time, since the necessity for additional PCI is rather high following initial PCI, and the price of stent devices is still very high, particularly in Switzerland. Patients, insurers, and experts in health care should be better and more honestly informed concerning the risks and costs of PCI and CABG procedures, as well as about the much higher rate of subsequent interventions following PCI. A team approach for all patients in whom both options could be offered seems mandatory to avoid unbalanced information of patients. Looking at the recent developments in transcatheter valve treatments, the revival of cardiological-cardiosurgical conferences seems to be a good option to optimize cooperation between the two medical specialties: cardiology and cardiac surgery.

Relevance: 30.00%

Publisher:

Abstract:

The notion of outsourcing – making arrangements with an external entity for the provision of goods or services to supplement or replace internal efforts – has been around for centuries. The outsourcing of information systems (IS) is, however, a much newer concept, but one which has been growing dramatically. This book attempts to synthesize what is known about IS outsourcing by dividing the subject into three interrelated parts: (1) Traditional Information Technology Outsourcing, (2) Information Technology Offshoring, and (3) Business Process Outsourcing. The book should be of interest to all academics and students in the field of Information Systems, as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.

Relevance: 30.00%

Publisher:

Abstract:

Conventional debugging tools present developers with means to explore the run-time context in which an error has occurred. In many cases this is enough to help the developer discover the faulty source code and correct it. Rather often, however, errors occur due to code that executed in the past, leaving certain objects in an inconsistent state. The actual run-time error only occurs when these inconsistent objects are used later in the program. So-called back-in-time debuggers help developers step back through earlier states of the program and explore execution contexts not available to conventional debuggers. Nevertheless, even back-in-time debuggers do not help answer the question, "Where did this object come from?" The Object-Flow Virtual Machine, which we have proposed in previous work, tracks the flow of objects to answer precisely such questions, but this VM does not provide dedicated debugging support to explore faulty programs. In this paper we present a novel debugger, called Compass, to navigate between conventional run-time, stack-oriented control flow views and object flows. Compass enables a developer to effectively navigate from an object contributing to an error back in time through all the code that has touched the object. We present the design and implementation of Compass, and we demonstrate how flow-centric, back-in-time debugging can be used to effectively locate the source of hard-to-find bugs.
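For intuition only, a minimal, manually instrumented sketch of the flow-centric idea (Compass itself records object flow transparently inside the Object-Flow VM and needs no such instrumentation; the helper names here are hypothetical):

```python
import traceback

# Sketch of flow-centric provenance: each recorded event stores the source
# location that touched an object, so "Where did this object come from?"
# can be answered after the fact.
_provenance = {}  # id(obj) -> list of (event, source location)

def track(obj, event):
    """Record that `event` happened to obj at the caller's location."""
    caller = traceback.extract_stack()[-2]
    _provenance.setdefault(id(obj), []).append(
        (event, f"{caller.filename}:{caller.lineno}"))
    return obj

def origin_of(obj):
    """Back-in-time view: every recorded location that touched obj."""
    return _provenance.get(id(obj), [])

order = track({"qty": -1}, "created")   # inconsistent state introduced here
track(order, "queued for processing")   # ...used much later
print(origin_of(order))                 # walk back to the origin
```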