12 results for Methods engineering.

in Digital Commons at Florida International University


Relevance:

30.00%

Publisher:

Abstract:

Childhood lead poisoning is a major consequence of environmental contamination at the local population level, linking environmental health and environmental engineering. Environmental contamination is one of the pressing environmental concerns facing the world today. Current approaches often focus on large, industrial-scale contaminated sites designated by regulatory agencies for remediation. Prior to this study, there were no known published studies conducted at the smaller, local scale, such as neighborhoods, where much of the contamination requiring remediation is often present. An environmental health study of local lead-poisoning data showed that Liberty City, Little Haiti, and eastern Little Havana in Miami-Dade County, Florida accounted for a disproportionately high number of the county's reported childhood lead-poisoning cases. An engineering system was designed and developed for a comprehensive risk management methodology distinctively applicable to the geographic and environmental conditions of Miami-Dade County, Florida. Furthermore, a scientific approach for interpreting environmental health concerns, incorporating detailed environmental engineering control measures and methods for site remediation in contaminated media, was developed for implementation. Test samples were obtained from residents and sites in these specific communities in Miami-Dade County, Florida (Gasana and Chamorro 2002). Lead currently lacks an oral assessment, an inhalation assessment, and an oral slope factor, variables that are required to run a quantitative risk assessment. However, institutional controls from federal agencies' standards and regulations for lead-contaminated media yield adequate maximum concentration limits (MCLs). For this study, an MCL of 0.0015 mg/L was used. A risk management approach to lead-contaminated media demonstrates that linking environmental health and environmental engineering can yield a feasible solution.
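
As a rough illustration of how measured lead concentrations could be screened against the MCL adopted in the study, the following sketch compares hypothetical sample values against the 0.0015 mg/L limit; the sample labels and concentrations are illustrative assumptions, not data from the study.

```python
# Screening of measured lead concentrations against the MCL used in the
# study (0.0015 mg/L). Sample labels and concentrations are hypothetical.
MCL_MG_PER_L = 0.0015

samples_mg_per_l = {"sample 1": 0.0009, "sample 2": 0.0021, "sample 3": 0.0042}

for site, conc in samples_mg_per_l.items():
    ratio = conc / MCL_MG_PER_L
    status = "exceeds MCL" if ratio > 1.0 else "below MCL"
    print(f"{site}: {conc:.4f} mg/L ({ratio:.1f}x MCL) -> {status}")
```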

Relevance:

30.00%

Publisher:

Abstract:

In topographically flat wetlands, where a shallow water table and conductive soils may develop as a result of wet and dry seasons, the connection between surface water and groundwater is not only present but perhaps the key factor dominating the magnitude and direction of water flux. Because of these complex characteristics, modeling water flow through wetlands with more realistic process formulations (integrated surface water and groundwater, and vegetative flow resistance) is a real necessity. This dissertation focused on developing an integrated surface–subsurface hydrologic simulation numerical model by programming and testing the coupling of the USGS MODFLOW-2005 Groundwater Flow Process (GWF) package (USGS, 2005) with the 2D surface water routing model FLO-2D (O’Brien et al., 1993). The coupling included the procedures necessary to numerically integrate and verify both models as a single computational software system, hereafter referred to as WHIMFLO-2D (Wetlands Hydrology Integrated Model). An improved physical formulation of flow resistance through vegetation in shallow waters, based on the concept of drag force, was also implemented for the simulation of floodplains, while the classical methods (e.g., Manning, Chezy, Darcy-Weisbach) for calculating flow resistance were retained for canals and deeper waters. A preliminary demonstration exercise of WHIMFLO-2D at an existing field site was developed for the Loxahatchee Impoundment Landscape Assessment (LILA), an 80-acre area located at the Arthur R. Marshall Loxahatchee National Wildlife Refuge in Boynton Beach, Florida. After applying a number of simplifying assumptions, the results illustrated the ability of the model to simulate the hydrology of a wetland. In this illustrative case, a comparison between measured and simulated stage levels showed an average error of 0.31% with a maximum error of 2.8%; comparison of measured and simulated groundwater head levels showed an average error of 0.18% with a maximum of 2.9%. The coupling of the FLO-2D model with the MODFLOW-2005 model and the incorporation of the dynamic effect of flow resistance due to vegetation in the new modeling tool WHIMFLO-2D are an important contribution to the field of numerical modeling of hydrologic flow in wetlands.
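
As a minimal sketch of the two resistance formulations mentioned above (not WHIMFLO-2D code), the snippet below contrasts a Manning friction slope with a drag-force-based friction slope for emergent vegetation; the drag coefficient, stem diameter, and stem density are assumed illustrative values.

```python
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def manning_friction_slope(u, h, n=0.03):
    """Classical Manning friction slope for canals and deeper water."""
    return n**2 * u**2 / h**(4.0 / 3.0)

def vegetation_friction_slope(u, cd=1.0, stem_diameter=0.006, stems_per_m2=300.0):
    """Friction slope from a drag-force formulation for emergent vegetation.

    Drag per unit bed area: F_d = 0.5 * rho * C_d * a * h * u^2, with
    a = stems_per_m2 * stem_diameter (frontal area per unit volume).
    Dividing by rho * g * h gives an equivalent friction slope.
    """
    a = stems_per_m2 * stem_diameter
    return cd * a * u**2 / (2.0 * G)

# Compare the two resistance laws over a range of shallow depths.
u = 0.1  # depth-averaged velocity (m/s)
for h in (0.1, 0.3, 0.5, 1.0):
    print(f"h={h:4.1f} m  Manning Sf={manning_friction_slope(u, h):.2e}  "
          f"vegetation Sf={vegetation_friction_slope(u):.2e}")
```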

Relevance:

30.00%

Publisher:

Abstract:

This dissertation evaluated the feasibility of using commercially available immortalized cell lines to build a tissue-engineered in vitro blood-brain barrier (BBB) co-culture model for preliminary drug development studies. A mouse endothelial cell line and a rat astrocyte cell line purchased from the American Type Culture Collection (ATCC) were the building blocks of the co-culture model. An astrocyte-derived acellular extracellular matrix (aECM) was introduced into the co-culture model to provide a novel in vitro biomimetic basement membrane on which the endothelial cells could form tight junctions. Trans-endothelial electrical resistance (TEER) and solute mass transport studies were used to quantitatively evaluate tight junction formation in the in vitro BBB models. Immunofluorescence microscopy and Western blot analysis were used to qualitatively verify the in vitro expression of occludin, one of the earliest discovered tight junction proteins. Experimental data from a total of 12 experiments showed that the novel BBB in vitro co-culture model with the astrocyte-derived aECM (CO+aECM) was promising in terms of establishing tight junction formation, as represented by TEER values, transport profiles, and tight junction protein expression, when compared with traditional co-culture (CO) setups and with endothelial cells cultured alone. The experimental data were also comparable with several existing in vitro BBB models built using various methods. An in vitro colorimetric sulforhodamine B (SRB) assay revealed that the co-cultured samples with aECM suffered less cell loss on the basal sides of the insert membranes than the traditional co-culture samples. The novel tissue engineering approach using immortalized cell lines with the addition of aECM proved to be a relevant alternative to traditional BBB in vitro modeling.
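
For reference, TEER readings are typically converted to unit-area values by subtracting the resistance of a blank (cell-free) insert and multiplying by the membrane area. The sketch below shows that standard calculation with hypothetical numbers, not data from the 12 experiments.

```python
def teer_ohm_cm2(resistance_sample_ohm, resistance_blank_ohm, membrane_area_cm2):
    """Unit-area TEER: subtract the blank (cell-free) insert resistance,
    then scale by the insert membrane growth area."""
    return (resistance_sample_ohm - resistance_blank_ohm) * membrane_area_cm2

# Illustrative numbers for a 12-well insert (area ~1.12 cm^2); not study data.
print(teer_ohm_cm2(resistance_sample_ohm=180.0,
                   resistance_blank_ohm=120.0,
                   membrane_area_cm2=1.12))  # -> 67.2 ohm*cm^2
```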

Relevance:

30.00%

Publisher:

Abstract:

The major objectives of this dissertation were to develop optimal spatial techniques to model the spatial-temporal changes of the lake sediments and their nutrients from 1988 to 2006, and to evaluate the impacts of the hurricanes that occurred during 1998–2006. Mud areas, volumes, and weights were calculated using validated kriging models. The mud zone shrank by about 10.5% from 1988 to 1998 and expanded by about 6.2% from 1998 to 2006. From 1988 to 1998, mud thicknesses increased by up to 26 cm in the central lake area, while mud area and volume decreased by about 13.78% and 10.26%, respectively. From 1998 to 2006, mud depths declined by up to 41 cm in the central lake area and mud volume decreased by about 27%. Mud weight increased by up to 29.32% from 1988 to 1998 but decreased by over 20% from 1998 to 2006. The reduction of mud sediments is likely due to re-suspension and redistribution by waves and currents produced by large storm events, particularly Hurricanes Frances and Jeanne in 2004 and Wilma in 2005. Regression, kriging, geographically weighted regression (GWR), and regression-kriging models were calibrated and validated for the spatial analysis of the lake's sediment total phosphorus (TP) and total nitrogen (TN). Based on model performance and error analysis, the GWR models provided the most accurate predictions for TP and TN. TP values declined from an average of 651 to 593 mg/kg from 1998 to 2006, especially in the lake's western and southern regions. From 1988 to 1998, TP declined in the northern and southern areas and increased in the central-western part of the lake. TP weights increased by about 37.99%–43.68% from 1988 to 1998 and decreased by about 29.72%–34.42% from 1998 to 2006. From 1988 to 1998, TN decreased in most areas, especially in the northern and southern lake regions; the western littoral zone had the biggest increase, up to 40,000 mg/kg. From 1998 to 2006, TN declined from an average of 9,363 to 8,926 mg/kg, especially in the central and southern regions, while the biggest increases occurred in the northern lake and southern edge areas. TN weights increased by about 15%–16.2% from 1988 to 1998 and decreased by about 7%–11% from 1998 to 2006.
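
To illustrate the idea behind the GWR models identified as most accurate here, the following sketch computes a single local weighted-least-squares prediction with a Gaussian distance-decay kernel. The synthetic coordinates, predictors, and bandwidth are assumptions for demonstration only, not the calibrated lake models.

```python
import numpy as np

def gwr_predict(coords, X, y, target_xy, x0, bandwidth):
    """One-point geographically weighted regression prediction (Gaussian kernel).

    coords : (n, 2) sampled sediment-core locations
    X      : (n, p) local predictors, first column = 1 (intercept)
    y      : (n,)   observed TP or TN at the cores
    x0     : (p,)   predictor values at the prediction location
    Minimal sketch of the GWR idea, not the dissertation's fitted models.
    """
    d = np.linalg.norm(coords - target_xy, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)           # Gaussian distance-decay weights
    XtW = X.T * w                                     # equivalent to X.T @ diag(w)
    beta = np.linalg.solve(XtW @ X, XtW @ y)          # local weighted least squares
    return x0 @ beta

# Tiny synthetic demonstration (values are illustrative, not study data).
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(50, 2))
X = np.c_[np.ones(50), coords]                        # intercept + easting/northing
y = 600 + 5 * coords[:, 0] - 3 * coords[:, 1] + rng.normal(0, 10, 50)
print(gwr_predict(coords, X, y, np.array([5.0, 5.0]),
                  np.array([1.0, 5.0, 5.0]), bandwidth=3.0))
```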

Relevance:

30.00%

Publisher:

Abstract:

The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas, and minority populations experience the effects of lead poisoning disproportionately. Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One such method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment. Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate failure times under normal use, and statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were overstressed before encapsulation; a second, unaged set was also tested. After the stress test, specimens were monitored with a surface chemical testing pad to identify lead breaking through the encapsulant. Graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80% reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A; applying product A to the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80% reliable life of 19.78 years. This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010, an ambitious target that is feasible provided there is efficient application of innovative technology, a goal to which this study aims to contribute.
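
As a hedged illustration of how an 80% reliable life can be extracted from time-to-failure data, the sketch below fits a two-parameter Weibull model (a common choice in accelerated life testing, not necessarily the Shapiro-Meeker graphical analysis or Cox log-linear model used in the study) and reports the 20th percentile of the fitted failure distribution; the failure times are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical failure times expressed in equivalent years of normal use;
# these are illustrative numbers, not the study's test data.
failure_times = np.array([14.2, 17.8, 19.5, 21.1, 22.6, 24.0, 25.3, 27.9])

# Fit a two-parameter Weibull model (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(failure_times, floc=0)

# The "80% reliable life" is the time by which only 20% of specimens are
# expected to fail, i.e. the 20th percentile of the failure distribution.
t80 = stats.weibull_min.ppf(0.20, shape, loc=loc, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.1f} yr, 80%-reliability life={t80:.1f} yr")
```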

Relevance:

30.00%

Publisher:

Abstract:

Trenchless methods have come to be considered a viable solution for pipeline projects in urban areas, and their applicability is expected to increase with rapid advancements in technology and emerging concerns over the social costs of trenching methods. Selecting an appropriate project delivery system (PDS) is key to the success of trenchless projects. To ensure project success, the selected delivery system should be tailored to the specific characteristics of the trenchless project and the owner's needs, since the effectiveness of project delivery systems differs with project characteristics and owner requirements. Because different trenchless methods have specific characteristics, such as installation rate, installation length, and accuracy, the same project delivery system may not be equally effective for different methods. The intent of this paper is to evaluate the appropriateness of different PDSs for different trenchless methods. PDSs are examined through a structured decision-making process called the Fuzzy Delivery System Selection Model (FDSSM). The impacts of (a) the characteristics of trenchless projects and (b) owners' needs are incorporated into the FDSSM by collecting data through questionnaires deployed to professionals in the trenchless industry, in order to determine the importance of delivery system selection attributes for different trenchless methods, and then analyzing these data. The sensitivity of PDS rankings with respect to trenchless methods is considered in order to evaluate whether similar project delivery systems are equally effective for different trenchless methods. The effectiveness of a PDS with respect to an attribute is defined as follows: a project delivery system is most effective with respect to an attribute (e.g., ability to control cost growth) if no other project delivery system is more effective than it. The results of this study may assist trenchless project owners in selecting the appropriate PDS for the trenchless method selected.
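
A minimal sketch of the kind of fuzzy multi-attribute scoring that underlies a model like the FDSSM is shown below: triangular fuzzy ratings are defuzzified and combined with attribute weights to rank delivery systems. The delivery systems, attributes, weights, and ratings are hypothetical placeholders, not the questionnaire results.

```python
# Triangular fuzzy ratings (low, mode, high) of three hypothetical delivery
# systems against two attributes; all names and numbers are illustrative only.
ratings = {
    "design-bid-build": {"cost control": (5, 7, 9), "schedule": (3, 5, 7)},
    "design-build":     {"cost control": (3, 5, 7), "schedule": (7, 9, 9)},
    "CM-at-risk":       {"cost control": (5, 7, 9), "schedule": (5, 7, 9)},
}
attribute_weights = {"cost control": 0.6, "schedule": 0.4}

def defuzzify(tri):
    """Centroid of a triangular fuzzy number (l, m, u)."""
    return sum(tri) / 3.0

def crisp_score(pds_ratings):
    """Weighted sum of defuzzified attribute ratings."""
    return sum(attribute_weights[a] * defuzzify(r) for a, r in pds_ratings.items())

for pds, r in sorted(ratings.items(), key=lambda kv: -crisp_score(kv[1])):
    print(f"{pds:18s} score = {crisp_score(r):.2f}")
```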

Relevance:

30.00%

Publisher:

Abstract:

Fossil fuels meet a significant fraction of the world's energy demand, and their burning emits huge amounts of carbon dioxide into the atmosphere. The limited availability of fossil fuel resources and the environmental impact of their use therefore require a shift to alternative energy sources or carriers (such as hydrogen) in the foreseeable future. The development of methods to mitigate carbon dioxide emission into the atmosphere is equally important. Hence, extensive research has been carried out on cost-effective technologies for carbon dioxide capture and on techniques to establish a hydrogen economy. Hydrogen is a clean energy fuel with a very high specific energy content of about 120 MJ/kg and an energy density of 10 Wh/kg, but its potential is limited by the lack of environmentally friendly production methods and of a suitable storage medium. Conventional hydrogen production methods, such as steam methane reforming and coal gasification, were modified by the inclusion of NaOH. The modified methods are thermodynamically more favorable and can be regarded as near-zero-emission production routes. Further, suitable catalysts were employed to accelerate the proposed NaOH-assisted reactions, and a relation between reaction yield and catalyst size was established. A 1:1:1 molar mixture of LiAlH4, NaNH2, and MgH2 was investigated as a potential hydrogen storage medium, and its hydrogen desorption mechanism was explored using in-situ XRD and Raman spectroscopy. Mesoporous metal oxides were assessed for CO2 capture in both power and non-power sectors. A 96.96% conversion of mesoporous MgO (325 mesh, surface area 95.08 ± 1.5 m2/g) to MgCO3 was achieved at 350°C and 10 bar CO2, whereas the absorption capacity of zinc oxide ball-milled for 1 h was low, 0.198 gCO2/gZnO at 75°C and 10 bar CO2. Interestingly, 57% mass conversion of an Fe and Fe3O4 mixture to FeCO3 was observed at 200°C and 10 bar CO2. MgO, ZnO, and Fe3O4 could be completely regenerated at 550°C, 250°C, and 350°C, respectively. Furthermore, the possible retrofit of MgO and of the Fe and Fe3O4 mixture to a 300 MWe coal-fired power plant and to the iron-making industry was also evaluated.
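
As a back-of-the-envelope check on sorbent capacity (not a calculation from the dissertation), the sketch below converts a fractional oxide-to-carbonate conversion into grams of CO2 captured per gram of fresh sorbent using molar masses; with the reported 96.96% MgO conversion this gives roughly 1.06 gCO2/gMgO.

```python
# Convert a reported fractional carbonation conversion into a CO2 uptake
# capacity (g CO2 absorbed per g of fresh sorbent). Molar masses in g/mol.
M_CO2, M_MgO = 44.01, 40.30

def co2_capacity(conversion, molar_mass_oxide, molar_mass_co2=M_CO2):
    """Each mole of oxide converted to carbonate binds one mole of CO2."""
    return conversion * molar_mass_co2 / molar_mass_oxide

# Reported 96.96% MgO -> MgCO3 conversion at 350 degC and 10 bar CO2.
print(f"{co2_capacity(0.9696, M_MgO):.3f} g CO2 per g MgO")  # ~1.06
```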

Relevance:

30.00%

Publisher:

Abstract:

The redevelopment of Brownfields took off in the 1990s, supported by federal and state incentives and largely accomplished through local initiatives. Brownfields redevelopment has several associated benefits, including the revitalization of inner-city neighborhoods, the creation of jobs, the stimulation of tax revenues, greater protection of public health and natural resources, the renewal and reuse of existing civil infrastructure, and the protection of Greenfields. While these benefits are numerous, the obstacles to Brownfields redevelopment remain very much alive. Redevelopment issues typically embrace a host of financial and legal liability concerns, technical and economic constraints, competing objectives, and uncertainties arising from inadequate site information. Because the resources for Brownfields redevelopment are usually limited, local programs require creativity in addressing these obstacles in a manner that stretches their limited resources for returning Brownfields to productive use. Such programs may benefit from a structured and defensible decision framework for prioritizing sites for redevelopment, one that incorporates the desired objectives, the corresponding variables, and the uncertainties associated with Brownfields redevelopment. This thesis demonstrates the use of a decision analytic tool, Bayesian influence diagrams, and related decision analytic tools in developing quantitative decision models to evaluate and rank Brownfields sites on the basis of their redevelopment potential.
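
The core calculation that an influence diagram encodes, an expected-utility comparison under uncertain contamination states, can be sketched as follows; the site names, probabilities, and dollar figures are hypothetical and serve only to show the ranking logic.

```python
# Expected-utility scoring of candidate sites under uncertain contamination.
# All probabilities, costs ($M), and site names below are hypothetical.
sites = {
    "Site A": {"p_high": 0.30, "cleanup_high": 2.5, "cleanup_low": 0.4, "value": 3.0},
    "Site B": {"p_high": 0.60, "cleanup_high": 3.5, "cleanup_low": 0.6, "value": 4.0},
}

def expected_net_benefit(s):
    """E[redevelopment value - cleanup cost] over the contamination states."""
    expected_cost = s["p_high"] * s["cleanup_high"] + (1 - s["p_high"]) * s["cleanup_low"]
    return s["value"] - expected_cost

for name, s in sorted(sites.items(), key=lambda kv: -expected_net_benefit(kv[1])):
    print(f"{name}: expected net benefit = ${expected_net_benefit(s):.2f}M")
```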

Relevance:

30.00%

Publisher:

Abstract:

Engineering analysis on geometric models has been the main, if not the only, credible tool used by engineers and scientists to solve physical boundary problems, and high-speed computers have improved the accuracy and validation of the expected results. In practice, an engineering analysis is composed of two parts: the design of the model, and the analysis of the geometry with the boundary conditions and constraints imposed on it. Numerical methods are used to solve a large number of physical boundary problems independent of the model geometry. The time expended in the computational process is related to the imposed boundary conditions and to how well conformed the geometry is. Any geometric model that contains gaps or open lines is considered an imperfect geometric model, and major commercial solver packages are incapable of handling such inputs. Other packages apply methods such as patching or zippering to resolve these problems, but the final resolved geometry may differ from the original, and the changes may be unacceptable. The study proposed in this dissertation is based on a new technique for processing models with geometric imperfections without the need to repair or change the original geometry. An algorithm is presented that is able to analyze the imperfect geometric model with the imposed boundary conditions using a meshfree method and a distance field approximation to the boundaries. Experiments are proposed to analyze the convergence of the algorithm on imperfect model geometries, and the results will be compared with those for the same models with perfect geometries. Plots of the results will be presented for further analysis of and conclusions about the algorithm's convergence.
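
A minimal sketch of the distance-field idea (not the dissertation's algorithm) is shown below: the unsigned distance from query points to a boundary represented only by scattered samples is computed directly, so a gap in the boundary does not prevent evaluation.

```python
import numpy as np

def distance_field(query_points, boundary_samples):
    """Approximate unsigned distance from each query point to a boundary
    described only by scattered sample points (no watertight geometry needed)."""
    diffs = query_points[:, None, :] - boundary_samples[None, :, :]
    return np.min(np.linalg.norm(diffs, axis=2), axis=1)

# Boundary of a unit circle sampled with a deliberate gap (an "imperfect" model).
theta = np.linspace(0.1, 2 * np.pi - 0.1, 200)        # gap near theta = 0
boundary = np.c_[np.cos(theta), np.sin(theta)]

grid = np.array([[0.0, 0.0], [0.5, 0.0], [1.0, 0.0], [1.5, 0.0]])
print(distance_field(grid, boundary).round(3))          # roughly [1.0, 0.5, 0.1, 0.5]
```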

Relevance:

30.00%

Publisher:

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance; however, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. The dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and is formally verified using model checking. Second, Petri net models are automatically mined from event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against the partial order models using model checking. Our formal specification and verification of Mondex contribute to the worldwide effort to develop a verified software repository. Our method for mining Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. Predictive tools, however, must consider the tradeoff between precision and coverage. Building on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: (1) a post-prediction analysis method to increase coverage while ensuring precision; and (2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
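
To illustrate the class of bug McPatom targets, the sketch below shows a classic atomicity violation, a check-then-act sequence on a shared variable that another thread can interleave; this toy example and its lock-based fix are illustrative, not code from the dissertation.

```python
import threading

# Shared account balance; the read-check-update in withdraw() is intended to
# be atomic, but a second thread can interleave between the check and the
# write, so both withdrawals can pass the same check.
balance = 100

def withdraw(amount):
    global balance
    if balance >= amount:            # access 1: read/check
        # a context switch here lets another thread pass the same check
        balance = balance - amount   # access 2: update based on a stale read

threads = [threading.Thread(target=withdraw, args=(80,)) for _ in range(2)]
for t in threads: t.start()
for t in threads: t.join()
print(balance)  # may be -60 instead of the intended 20

# The fix: make the check and the update a single atomic block.
lock = threading.Lock()
def withdraw_atomic(amount):
    global balance
    with lock:
        if balance >= amount:
            balance -= amount
```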