990 results for Context modeling


Relevance: 30.00%

Abstract:

The objective of this research is to synthesize structural composites designed with particular areas defined with custom modulus, strength, and toughness values in order to improve the overall mechanical behavior of the composite. Such composites are defined and referred to as 3D-designer composites. These composites will be formed from liquid crystalline (LC) polymers and carbon nanotubes. The fabrication process is a variation of rapid prototyping, a layered, additive-manufacturing approach. Composites formed using this process can be custom designed, using appropriate modeling methods, for superior performance in advanced applications. The focus of this research is on enhancement of Young's modulus in order to make the final composite stiffer. Strength and toughness of the final composite with respect to various applications are also discussed. We consider the mechanical properties of the final composite at different fiber volume fractions as well as at different fiber orientations and lengths. The orientation of the LC monomers is to be achieved using electric or magnetic fields. A computer program incorporating the Mori-Tanaka scheme was developed to generate the stiffness matrix of the final composite; the final properties are then deduced from the stiffness matrix using composite micromechanics. Eshelby's tensor, required to calculate the stiffness tensor in the Mori-Tanaka method, is computed with the numerical scheme of Gavazzi and Lagoudas (1990). The numerical integration is solved using Gaussian quadrature and is implemented in MATLAB, whose built-in commands and algorithms handle these computations efficiently. Graphs are plotted for different combinations of the results and the parameters used to obtain them.
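
For illustration only, a minimal Python sketch of the Mori-Tanaka homogenization step: the thesis computes the full stiffness tensor with a numerically integrated Eshelby tensor (Gavazzi and Lagoudas 1990), whereas this sketch uses the closed-form special case of spherical inclusions in an isotropic matrix, and all property values are hypothetical.

```python
# Minimal Mori-Tanaka sketch: effective bulk/shear moduli of a two-phase
# composite with spherical inclusions in an isotropic matrix. The full
# method in the text uses the tensorial Eshelby solution evaluated by
# Gaussian quadrature; this closed form is the isotropic special case.
# All property values below are hypothetical.

def mori_tanaka_spherical(Km, Gm, Ki, Gi, f):
    """Effective (K, G) for volume fraction f of spherical inclusions."""
    # Eshelby-derived matrix terms for spherical inclusions
    K_star = Km + 4.0 * Gm / 3.0
    zeta = Gm * (9.0 * Km + 8.0 * Gm) / (6.0 * (Km + 2.0 * Gm))
    K_eff = Km + f * (Ki - Km) * K_star / (K_star + (1 - f) * (Ki - Km))
    G_eff = Gm + f * (Gi - Gm) * (Gm + zeta) / (Gm + zeta + (1 - f) * (Gi - Gm))
    return K_eff, G_eff

def youngs_modulus(K, G):
    """Isotropic Young's modulus from bulk and shear moduli."""
    return 9.0 * K * G / (3.0 * K + G)

# Hypothetical moduli (GPa): epoxy-like matrix, stiff nanotube-like filler
K, G = mori_tanaka_spherical(Km=4.0, Gm=1.5, Ki=200.0, Gi=80.0, f=0.10)
print(f"K_eff = {K:.2f} GPa, G_eff = {G:.2f} GPa, "
      f"E_eff = {youngs_modulus(K, G):.2f} GPa")
```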

Relevance: 30.00%

Abstract:

A novel mechanistic model for the saccharification of cellulose and hemicellulose is utilized to predict the products of hydrolysis over a range of enzyme loadings and times. The mechanistic model considers the morphology of the substrate and the kinetics of enzymes to optimize enzyme concentrations for the enzymatic hydrolysis of cellulose and hemicellulose simultaneously. Substrates are modeled based on their fraction of accessible sites, glucan content, xylan content, and degrees of polymerization. The enzyme optimization model accounts for the kinetics of six core enzymes for lignocellulose hydrolysis: endoglucanase I (EG1), cellobiohydrolase I (CBH1), cellobiohydrolase II (CBH2), and endo-xylanase (EX) from Trichoderma reesei, and β-glucosidase (BG) and β-xylosidase (BX) from Aspergillus niger. The model employs the synergistic action of these enzymes to predict optimum enzyme concentrations for hydrolysis of Avicel and ammonia fiber explosion (AFEX) pretreated corn stover. Glucan, glucan + xylan, glucose, and glucose + xylose conversion predictions are given over ranges of enzyme mass fractions and enzyme loadings. Simulation results are compared with optimizations using statistically designed experiments. BG and BX are modeled in solution at later time points to predict their effect on glucose and xylose conversion.
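
As a heavily simplified illustration of the kind of kinetic bookkeeping such a model performs (this is not the study's morphology-aware mechanism; the two-step glucan → cellobiose → glucose chain and all rate constants are invented for the sketch):

```python
# Toy two-step saccharification kinetics: glucan --(EG/CBH)--> cellobiose
# --(BG)--> glucose, with first-order rates proportional to enzyme loading.
# The actual model tracks substrate morphology and six enzymes; this sketch
# only illustrates the ODE structure. All constants are assumptions.
from scipy.integrate import solve_ivp

k_cel = 0.08   # 1/h per g/L of cellulases (EG1 + CBH1 + CBH2), assumed
k_bg = 0.50    # 1/h per g/L of beta-glucosidase (BG), assumed

def rhs(t, y, e_cel, e_bg):
    glucan, cellobiose, glucose = y
    r1 = k_cel * e_cel * glucan        # glucan -> cellobiose
    r2 = k_bg * e_bg * cellobiose      # cellobiose -> glucose
    # 1 g glucan -> 1.056 g cellobiose -> 1.053 g glucose (hydration mass gain)
    return [-r1, 1.056 * r1 - r2, 1.053 * r2]

# 50 g/L glucan, 0.5 g/L cellulases, 0.1 g/L BG, 72 h of hydrolysis
sol = solve_ivp(rhs, (0.0, 72.0), [50.0, 0.0, 0.0], args=(0.5, 0.1))
print(f"glucose after 72 h: {sol.y[2, -1]:.1f} g/L")
```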

Relevance: 30.00%

Abstract:

The study of volcano deformation data can provide information on magma processes and help assess the potential for future eruptions. In employing inverse deformation modeling on these data, we attempt to characterize the geometry, location, and volume/pressure change of a deformation source. Techniques currently used to model sheet intrusions (e.g., dikes and sills) often require significant a priori assumptions about source geometry and can require testing a large number of parameters. Moreover, surface deformations are a non-linear function of the source geometry and location, which requires the use of Monte Carlo inversion techniques and leads to long computation times. Recently, ‘displacement tomography’ models have been used to characterize magma reservoirs by inverting source deformation data for volume changes using a grid of point sources in the subsurface. The computations involved in these models are less intensive because no assumptions are made about the source geometry and location, and the relationship between the point sources and the surface deformation is linear. In this project, seeking a less computationally intensive technique for fracture sources, we tested whether this displacement tomography method for reservoirs could be used for sheet intrusions. We began by simulating the opening of three synthetic dikes of known geometry and location using an established deformation model for fracture sources. We then sought to reproduce the displacements and volume changes undergone by the fractures using the sources employed in the tomography methodology. Results of this validation indicate that the volumetric point sources are not appropriate for locating fracture sources; however, they may provide useful qualitative information on volume changes occurring in the surrounding rock, and therefore indirectly indicate the source location.
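
A minimal sketch of the linear point-source inversion idea, assuming Mogi point sources in an elastic half-space with Poisson's ratio 0.25 (the standard volumetric point source; the grid, data, and noise level here are illustrative, not the project's setup):

```python
# Sketch of 'displacement tomography': vertical surface displacement is a
# linear function of the volume changes dV_j of point (Mogi) sources on a
# grid, u = G @ dV, so dV can be recovered by non-negative least squares.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
x_obs = np.linspace(-5e3, 5e3, 40)              # surface points (m)
x_src = np.linspace(-4e3, 4e3, 9)               # source grid, x (m)
d_src = np.array([1e3, 2e3, 3e3])               # source depths (m)
xs, ds = np.meshgrid(x_src, d_src)
xs, ds = xs.ravel(), ds.ravel()

def mogi_uz(x, x0, d):
    """Vertical displacement per unit volume change (nu = 0.25)."""
    r2 = (x - x0) ** 2 + d ** 2
    return (3.0 / (4.0 * np.pi)) * d / r2 ** 1.5

G = np.array([[mogi_uz(x, x0, d) for x0, d in zip(xs, ds)] for x in x_obs])

dV_true = np.zeros(xs.size)
dV_true[10] = 2e6                                   # one inflating source (m^3)
u = G @ dV_true + rng.normal(0, 1e-4, x_obs.size)   # noisy data (m)

dV_est, _ = nnls(G, u)                              # non-negative least squares
print(f"recovered total dV = {dV_est.sum():.2e} m^3 (true 2.00e+06)")
```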

Relevance: 30.00%

Abstract:

Implementation of stable aeroelastic models able to capture the complex features of multi-concept smart blades is a prime step in reducing the uncertainties that come along with blade dynamics. Numerical simulations of fluid-structure interaction can then be used to test realistic scenarios comprising full-scale blades at a reasonably low computational cost. A code combining two advanced numerical models was designed and run on a parallel HPC supercomputer platform. The first model is based on a variation of the dimensional reduction technique proposed by Hodges and Yu, and records the structural response of heterogeneous composite blades. The technique reduces the geometrical complexity of the heterogeneous blade section to a stiffness matrix for an equivalent beam; the derived 1-D strain energy matrix is equivalent to the actual 3-D strain energy matrix in an asymptotic sense. Because this 1-D matrix allows the blade structure to be modeled accurately as a 1-D finite element problem, it substantially reduces the computational effort, and hence the cost, required to model the structural dynamics at each step. The second model implements Blade Element Momentum (BEM) theory, in which all velocities and forces are mapped using orthogonal matrices that capture the large deformations and rotation effects when calculating the aerodynamic forces; this ultimately accounts for the complex flexo-torsional deformations. In this thesis these computational tools, developed by MTU's research team, were successfully tested for the aeroelastic analysis of wind-turbine blades. The validation is based largely on several experiments on the NREL 5MW blade, which is widely accepted as a benchmark blade in the wind industry. Along with the use of this model, the internal blade structure was also modified to build on the benefits of the existing numerical models.
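
A minimal fixed-point BEM iteration for a single blade element, as a sketch of the aerodynamic side only: the airfoil is approximated as a flat plate, the geometry and operating point are hypothetical, and the thesis's model additionally handles large deformations through orthogonal rotation matrices, which this sketch omits.

```python
# Minimal blade-element-momentum (BEM) iteration for one annulus:
# fixed-point update of the axial (a) and tangential (a') induction factors.
# Airfoil approximated as a flat plate (Cl = 2*pi*sin(alpha), Cd = 0.01);
# all geometry and operating-point numbers are hypothetical.
import math

B, r, chord = 3, 40.0, 3.0                # blades, local radius, chord (m)
twist = math.radians(4.0)                 # local twist
U, omega = 10.0, 1.0                      # wind speed (m/s), rotor speed (rad/s)
sigma = B * chord / (2.0 * math.pi * r)   # local solidity

a, ap = 0.3, 0.0
for _ in range(200):
    phi = math.atan2(U * (1 - a), omega * r * (1 + ap))   # inflow angle
    alpha = phi - twist
    Cl, Cd = 2.0 * math.pi * math.sin(alpha), 0.01
    Cn = Cl * math.cos(phi) + Cd * math.sin(phi)          # normal coefficient
    Ct = Cl * math.sin(phi) - Cd * math.cos(phi)          # tangential coefficient
    a_new = 1.0 / (4.0 * math.sin(phi) ** 2 / (sigma * Cn) + 1.0)
    ap_new = 1.0 / (4.0 * math.sin(phi) * math.cos(phi) / (sigma * Ct) - 1.0)
    if abs(a_new - a) < 1e-8 and abs(ap_new - ap) < 1e-8:
        break
    a, ap = 0.5 * (a + a_new), 0.5 * (ap + ap_new)        # relaxed update

print(f"a = {a:.4f}, a' = {ap:.4f}, inflow angle = {math.degrees(phi):.2f} deg")
```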

Relevance: 30.00%

Abstract:

Colloid self-assembly under external control is a new route to the fabrication of advanced materials with novel microstructures and appealing functionalities. The kinetic processes of colloidal self-assembly have also attracted great interest because they resemble many atomic-level kinetic processes in materials. In the past decades, rapid technological progress has been achieved in producing shape-anisotropic, patchy, and core-shell structured particles, and particles with electric/magnetic charges/dipoles, which has greatly enriched the range of self-assembled structures. Multi-phase carrier liquids offer a further route to controlling colloidal self-assembly. Heterogeneity is therefore an essential characteristic of colloid systems, yet there is still no model able to efficiently incorporate all these possible heterogeneities. This thesis is mainly devoted to the development of such a model and to a computational study of complex colloid systems through a diffuse-interface field approach (DIFA), recently developed by Wang et al. This meso-scale model can describe arbitrary particle shapes and arbitrary charge/dipole distributions on the surface or in the body of particles. Within the framework of DIFA, a Gibbs-Duhem-type formula is introduced to treat Laplace pressure in multi-liquid-phase colloidal systems; it obeys the Young-Laplace equation, making the model capable of quantitatively studying important capillarity-related phenomena. Extensive computer simulations are performed to study the fundamental behavior of heterogeneous colloidal systems. The role of Laplace pressure is revealed in determining the mechanical equilibrium of shape-anisotropic particles at fluid interfaces. In particular, Laplace pressure is found to play a critical role in maintaining the stability of capillary bridges between close particles, which sheds light on a novel route to firming compact but fragile colloidal microstructures in situ via capillary bridges. Simulation results also show that competition between like-charge repulsion, dipole-dipole interaction, and Brownian motion dictates the degree of aggregation of heterogeneously charged particles. Assembly and alignment of particles with magnetic dipoles under an external field is studied. Finally, extended studies of the role of dipole-dipole interaction are performed for ferromagnetic and ferroelectric domain phenomena; the results reveal that the internal field generated by the dipoles competes with the external field to determine the dipole-domain evolution in ferroic materials.
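
Two ingredients named in this abstract, shown in their idealized textbook limits: the Young-Laplace capillary pressure across a curved interface and the point-dipole interaction energy that drives chaining or aggregation. The DIFA model resolves both on diffuse fields; these sharp-interface and point-dipole formulas, with assumed parameter values, are purely illustrative.

```python
# Young-Laplace pressure jump and point-dipole interaction energy:
# the sharp/point idealizations of two physical effects the DIFA model
# resolves on diffuse fields. All numerical values are assumptions.
import numpy as np

def young_laplace(gamma, R1, R2):
    """Pressure jump across an interface with principal radii R1, R2."""
    return gamma * (1.0 / R1 + 1.0 / R2)

def dipole_energy(m1, m2, r_vec, mu0=4e-7 * np.pi):
    """Interaction energy of two point magnetic dipoles (SI units)."""
    r = np.linalg.norm(r_vec)
    rhat = r_vec / r
    return (mu0 / (4.0 * np.pi * r ** 3)) * (
        np.dot(m1, m2) - 3.0 * np.dot(m1, rhat) * np.dot(m2, rhat))

# Capillary bridge neck (saddle: one negative radius) -> strong suction
print(f"dP = {young_laplace(0.072, 1e-6, -0.2e-6):.0f} Pa")  # water-air gamma

# Head-to-tail dipoles attract (negative energy); side-by-side repel
m = np.array([0.0, 0.0, 1e-15])                 # dipole moment (A m^2), assumed
print(dipole_energy(m, m, np.array([0.0, 0.0, 1e-6])))  # aligned: attractive
print(dipole_energy(m, m, np.array([1e-6, 0.0, 0.0])))  # side by side: repulsive
```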

Relevance: 30.00%

Abstract:

Measurement and modeling techniques were developed to improve over-water gaseous air-water exchange measurements for persistent bioaccumulative and toxic chemicals (PBTs). Analytical methods were applied to atmospheric measurements of hexachlorobenzene (HCB), polychlorinated biphenyls (PCBs), and polybrominated diphenyl ethers (PBDEs). The sampling and analytical methods are also well suited to studying semivolatile organic compounds (SOCs) in air, with applications to secondary organic aerosol formation and to urban and indoor air quality. A novel gas-phase cleanup method is described for use with thermal desorption analysis of atmospheric SOCs in multicapillary denuders. The cleanup selectively removed hydrogen-bonding chemicals from samples, including much of the background matrix of oxidized organic compounds in ambient air, and thereby improved precision and method detection limits for nonpolar analytes. A model is presented that predicts gas collection efficiency and particle collection artifact for SOCs in multicapillary denuders using polydimethylsiloxane (PDMS) sorbent. An approach is presented to estimate the equilibrium PDMS-gas partition coefficient (Kpdms) from an Abraham solvation parameter model for any SOC. A high-flow-rate (300 L min-1) multicapillary denuder was designed for measurement of trace atmospheric SOCs. Overall method precision and detection limits were determined using field duplicates and compared to the conventional high-volume sampler method. The high-flow denuder is an alternative to high-volume or passive samplers when separation of gas- and particle-associated SOCs upstream of a filter and a short sample collection time are advantageous. A Lagrangian internal boundary layer transport exchange (IBLTE) model is described. The model predicts the near-surface variation of several quantities with fetch in coastal, offshore flow: 1) the modification in potential temperature and gas mixing ratio; 2) the surface fluxes of sensible heat, water vapor, and trace gases, using the NOAA COARE Bulk Algorithm and Gas Transfer Model; and 3) the vertical gradients in potential temperature and mixing ratio. The model was applied to interpret micrometeorological measurements of the air-water exchange flux of HCB and several PCB congeners in Lake Superior. The IBLTE model can be applied to any scalar, including water vapor, carbon dioxide, dimethyl sulfide, and other scalar quantities of interest in hydrology, climate, and ecosystem science.
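
The Abraham solvation parameter model mentioned here has the standard linear free-energy form log K = c + eE + sS + aA + bB + lL. A sketch of that form follows; the system coefficients and solute descriptors below are placeholders, not the values fitted in the study.

```python
# Sketch of an Abraham solvation parameter model for the PDMS-gas partition
# coefficient: log K_pdms = c + e*E + s*S + a*A + b*B + l*L.
# All coefficients and descriptors are illustrative assumptions.
def log_k_pdms(E, S, A, B, L,
               c=-0.1, e=0.1, s=0.8, a=1.0, b=0.1, l=0.9):  # assumed system coeffs
    """Linear free-energy estimate of log K_pdms from Abraham descriptors."""
    return c + e * E + s * S + a * A + b * B + l * L

# Hypothetical descriptors for a nonpolar semivolatile solute
print(f"log K_pdms = {log_k_pdms(E=1.4, S=1.0, A=0.0, B=0.1, L=7.5):.2f}")
```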

Relevance: 30.00%

Abstract:

Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate one or more lanes adjacent to a freeway that provide congestion-free trips to eligible users, such as transit vehicles or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the approaches for predicting this demand, the four-step demand forecasting process is the most common, with managed lane demand usually estimated at the assignment step; the key to reliably estimating the demand is therefore the use of effective assignment modeling processes. Managed lanes are particularly effective when the road is functioning near capacity, so capturing variations in demand and in network attributes and performance is crucial for their modeling, monitoring, and operation. As a result, traditional modeling approaches, such as the static traffic assignment used in demand forecasting models, fail to correctly predict managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced modeling approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when used to model managed lanes in congested environments. In addition, the study develops processes to support effective use of DTA for modeling managed lane operations. Static and dynamic traffic assignment consist of demand, network, and route choice model components that need to be calibrated; these components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment was developed to replicate real-world traffic conditions. With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in different stages of modeling and calibration of managed lanes. Extensive and careful processing of demand, traffic, and toll data, together with a proper definition of performance measures, results in a calibrated and stable model that closely replicates real-world congestion patterns and responds reasonably to perturbations in network and demand properties.
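
One way to see why near-capacity operation is hard to model: travel time grows steeply with the volume/capacity ratio, and lane choice feeds back on travel time. The sketch below is not the study's DTA framework; it pairs the standard BPR volume-delay function with a simple binary logit toll/general-purpose split, with all parameters invented for illustration.

```python
# Toy managed-lane split: iterate a binary logit choice between a tolled
# managed lane (ML) and general-purpose (GP) lanes to a fixed point, with
# BPR volume-delay on each facility. All parameters are illustrative.
import math

def bpr(t0, v, c, alpha=0.15, beta=4.0):
    """BPR volume-delay: travel time as a function of volume/capacity."""
    return t0 * (1.0 + alpha * (v / c) ** beta)

demand = 5000.0                  # total corridor demand (veh/h)
cap_ml, cap_gp = 1800.0, 4000.0  # facility capacities (veh/h)
t0 = 10.0                        # free-flow travel time (min)
toll_min = 5.0                   # toll expressed in minutes of travel time
theta = 0.5                      # logit scale parameter (1/min)

share = 0.2
for _ in range(100):
    t_ml = bpr(t0, share * demand, cap_ml) + toll_min
    t_gp = bpr(t0, (1 - share) * demand, cap_gp)
    new_share = 1.0 / (1.0 + math.exp(-theta * (t_gp - t_ml)))  # P(choose ML)
    share = 0.5 * (share + new_share)  # relaxed fixed-point update

print(f"ML share = {share:.2%}, t_ML = {t_ml:.1f} min, t_GP = {t_gp:.1f} min")
```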

Relevance: 30.00%

Abstract:

Low-frequency electromagnetic compatibility (EMC) is an increasingly important aspect of the design of practical systems, needed to ensure the functional safety and reliability of complex products. The opportunities for using numerical techniques to predict and analyze a system's EMC are therefore of considerable interest in many industries. As the first phase of this study, a proper model including all the details of the components was required, so advances in EMC modeling were reviewed and analytical and numerical models classified. The selected approach was finite element (FE) modeling, coupled with the distributed network method, to generate models of the converter's components and obtain the frequency behavioral model of the converter. The method can reveal the behavior of parasitic elements and higher resonances, which have critical impacts in studying EMI problems. For the EMC and signature studies of machine drives, equivalent source modeling was studied. Considering the details of the multi-machine environment, including actual models, some innovations in equivalent source modeling were introduced to decrease the simulation time dramatically. Several models were designed in this study; the voltage-current cube model and the wire model gave the best results. A GA-based PSO method was used as the optimization process. Superposition and suppression of the fields in coupling the components were also studied and verified. The simulation time of the equivalent model is 80-100 times lower than that of the detailed model, and all tests were verified experimentally. As an application of the EMC and signature study, fault diagnosis and condition monitoring of an induction motor drive were developed using radiated fields. In addition to experimental tests, 3D FE analysis was coupled with circuit-based software to implement the incipient fault cases, and identification was implemented using an ANN for seventy different fault cases. The simulation results were verified experimentally. Finally, identification of the types of power components was implemented. The results show that it is possible to identify the type of components, as well as the faulty components, by comparing the amplitudes of their stray field harmonics. Identification using stray fields is nondestructive and can be used for setups that cannot go offline and be dismantled.
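
For orientation, a plain particle swarm optimization (PSO) loop of the kind used to fit equivalent-source parameters. The study's variant is GA-based; this is the textbook algorithm, and the objective here is a generic stand-in (the Rosenbrock function) rather than the actual field-matching error.

```python
# Plain PSO sketch: minimize a stand-in objective representing the error
# between the detailed model's field and the equivalent source's field.
import numpy as np

rng = np.random.default_rng(1)

def objective(x):  # stand-in for the equivalent-source field-matching error
    return (1 - x[..., 0]) ** 2 + 100 * (x[..., 1] - x[..., 0] ** 2) ** 2

n, dim, iters = 30, 2, 300
w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and acceleration weights
pos = rng.uniform(-2, 2, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_val = objective(pos)
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print(f"best point {gbest}, objective {objective(gbest):.2e}")
```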

Relevance: 30.00%

Abstract:

The Pleistocene carbonate rock of the Biscayne Aquifer in south Florida contains laterally extensive bioturbated oolitic zones characterized by interconnected touching-vug megapores that channelize most flow and make the aquifer extremely permeable. Standard petrophysical laboratory techniques may not be capable of accurately measuring such high permeabilities, so innovative procedures that can were applied. These fragile rocks cannot easily be cored or cut to shapes convenient for permeability measurements; instead, for the laboratory measurement, a 3D epoxy-resin printed rock core was produced from computed tomography data obtained from an outcrop sample. Permeability measurements were conducted using a viscous fluid to permit easily observable head gradients (~2 cm over 1 m) together with low-Reynolds-number flow. For a second, independent permeability estimate, Lattice Boltzmann Method flow simulations were computed on the 3D core renderings. Agreement between the two estimates indicates that an accurate permeability was obtained that can be applied in future studies.
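
A minimal Darcy's-law reduction of such a laboratory measurement, assuming the standard relation k = μQL/(AΔP); every number below is an illustrative assumption except the ~2 cm over 1 m head gradient quoted in the abstract.

```python
# Darcy's law: intrinsic permeability from the observed head gradient and
# flow rate through the printed core. All values other than the quoted
# head-gradient scale are assumptions.
RHO, G = 1260.0, 9.81      # fluid density (kg/m^3), glycerol-like; assumed
MU = 1.0                   # dynamic viscosity (Pa s); viscous fluid, assumed
Q = 2.0e-7                 # volumetric flow rate (m^3/s), assumed
A = 8.0e-3                 # core cross-sectional area (m^2), assumed
L = 1.0                    # flow length (m)
dh = 0.02                  # head drop (m): ~2 cm over 1 m, as in the text

dP = RHO * G * dh                    # pressure drop from the head gradient
k = MU * Q * L / (A * dP)            # intrinsic permeability (m^2)
print(f"k = {k:.2e} m^2")
```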

Relevance: 30.00%

Abstract:

Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models supporting the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat; it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative, and geospatial variables that differ in scale, weight, and type. Though many of these variables are recognized by specialists in security studies, controversy remains with respect to their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically weighted regression analysis produced the most accurate result by accommodating non-stationary coefficient behavior, demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism. This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality-of-life.
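
The core of geographically weighted regression (GWR) is a weighted least-squares fit at each location, with weights from a distance kernel, so coefficients can vary over space. A minimal sketch on synthetic placeholder data (the bandwidth, kernel, and data are assumptions, not the dissertation's specification):

```python
# GWR core: local weighted least squares with a Gaussian distance kernel,
# yielding spatially varying coefficients. Data here are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n = 200
coords = rng.uniform(0, 100, (n, 2))                        # locations (e.g., km)
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 covariates
# Synthetic truth: one coefficient drifts with longitude (non-stationarity)
beta_true = np.column_stack(
    [np.full(n, 1.0), 0.02 * coords[:, 0], np.full(n, -0.5)])
y = np.einsum("ij,ij->i", X, beta_true) + rng.normal(0, 0.1, n)

def gwr_at(i, bandwidth=20.0):
    """Local WLS fit centred on observation i."""
    d = np.linalg.norm(coords - coords[i], axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)       # Gaussian kernel weights
    XtW = X.T * w
    return np.linalg.solve(XtW @ X, XtW @ y)      # (X'WX)^-1 X'Wy

betas = np.array([gwr_at(i) for i in range(n)])
print("local coefficient on covariate 1 ranges:",
      betas[:, 1].min().round(2), "to", betas[:, 1].max().round(2))
```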

Relevance: 30.00%

Abstract:

The Chihuahuan Desert is one of the most biologically diverse ecosystems in the world, but it suffers serious degradation because of changes in fire regimes that result in large catastrophic fires. My study was conducted in the Sierra La Mojonera (SLM) natural protected area in Mexico. The purpose of this study was to apply FARSITE fire modeling as a fire management tool in developing an integrated fire management plan at SLM. Firebreaks proved to contain 100% of wildfire outbreaks. The rosetophilous scrub experienced the fastest rate of fire spread and lowland creosote bush scrub the slowest; among the months, March showed the fastest rate of fire spread and September the slowest. The results of my study provide a tool for wildfire management through the use of geospatial technologies, in particular FARSITE fire modeling, at SLM and in Mexico.

Relevance: 30.00%

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance, but concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built using Petri nets from user requirements and formally verified using model checking. Second, Petri net models are automatically mined from event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against the partial order models using model checking. Our formal specification and verification of Mondex contribute to the worldwide effort to develop a verified software repository. Our method for mining Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. Predictive tools, however, must consider the tradeoff between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision, and 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
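
For readers unfamiliar with the formalism, a minimal place/transition Petri net: a transition is enabled when every input place holds enough tokens, and firing moves tokens from inputs to outputs. The toy "purse transfer" net below is an invented example, far simpler than the dissertation's Mondex and workflow models.

```python
# Minimal place/transition Petri net with enabledness and firing semantics.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Toy purse transfer: value moves from 'from' to 'to' via an in-flight place
net = PetriNet({"from": 1, "in_flight": 0, "to": 0})
net.add_transition("send", {"from": 1}, {"in_flight": 1})
net.add_transition("receive", {"in_flight": 1}, {"to": 1})
net.fire("send")
net.fire("receive")
print(net.marking)   # {'from': 0, 'in_flight': 0, 'to': 1}
```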

Relevance: 30.00%

Abstract:

Every space launch increases the overall amount of space debris, yet satellites have limited awareness of nearby objects that might pose a collision hazard. Astrometric, radiometric, and thermal models for the study of space debris in low-Earth orbit were developed, and the modeling approach proposes analysis methods that provide increased Local Area Awareness for satellites in low-Earth and geostationary orbit. Local Area Awareness is defined as the ability to detect, characterize, and extract useful information regarding resident space objects as they move through the space environment surrounding a spacecraft. The study of space debris is of critical importance to all space-faring nations. Characterization efforts are proposed using long-wave infrared (LWIR) sensors for space-based observations of debris objects in low-Earth orbit. LWIR sensors are commercially available and do not require the target to be solar-illuminated, since the received signal depends on the target's temperature. Characterizing debris objects through passive imaging techniques allows further studies into the origination, specifications, and future trajectory of debris objects. Conclusions are drawn regarding the thermal analysis as a function of debris orbit, geometry, orientation with respect to time, and material properties. Development of a thermal model permits the characterization of debris objects based on their received LWIR signals, from which information regarding the material type, size, and tumble rate of the observed objects is extracted. This investigation proposes using LWIR radiometric models of typical debris to develop techniques for the detection and characterization of debris objects via signal analysis of unresolved imagery. Knowledge of the orbital type and semi-major axis of the observed debris object is extracted via astrometric analysis and may aid in constraining the admissible region for the initial orbit determination process. The resulting orbital information is then fused with the radiometric characterization analysis, enabling further characterization of the observed debris object. This fused analysis, yielding orbital, material, and thermal properties, significantly increases a satellite's Local Area Awareness through an intimate understanding of the debris environment surrounding the spacecraft.
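
The physical basis for the temperature-dependent LWIR signal is Planck's law integrated over the sensor band. A sketch of that calculation follows; the 8-14 µm band edges, emissivity, and temperatures are assumptions, and this is not the study's full radiometric model.

```python
# Temperature dependence of the LWIR signal: Planck spectral radiance
# integrated over a nominal 8-14 micron band for assumed debris temperatures.
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # Planck, light speed, Boltzmann (SI)

def planck(lam, T):
    """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    return (2.0 * H * C ** 2 / lam ** 5) / np.expm1(H * C / (lam * KB * T))

lam = np.linspace(8e-6, 14e-6, 500)         # LWIR band (m), assumed edges
dlam = lam[1] - lam[0]
for T in (200.0, 300.0, 400.0):             # plausible debris temperatures (K)
    L_band = (0.9 * planck(lam, T)).sum() * dlam   # emissivity 0.9, assumed
    print(f"T = {T:.0f} K: band radiance = {L_band:.1f} W m^-2 sr^-1")
```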

Relevance: 30.00%

Abstract:

Background: Plant-soil interaction is central to human food production and ecosystem function. It is therefore essential not only to understand these interactions, but also to develop predictive mathematical models that can be used to assess how climate and soil management practices will affect them.

Scope: In this paper we review current developments in structural and chemical imaging of rhizosphere processes within the context of multiscale, mathematical, image-based modeling. We outline areas that need more research and areas that would benefit from more detailed understanding.

Conclusions: The combination of structural and chemical imaging with modeling is a powerful tool that is fundamental for understanding how plant roots interact with soil, and we emphasize the need to attract more researchers to this area, which is so fertile for future discoveries. Finally, model building must go hand in hand with experiments. In particular, there is a real need to integrate rhizosphere structural and chemical imaging with modeling, leading to models that explicitly account for pore-scale processes.

Relevance: 30.00%

Abstract:

Sweet potato is an important strategic agricultural crop grown in many countries around the world. The roots and aerial vine components of the crop are used for human consumption and, to some extent, as a cheap source of animal feed. In spite of its economic value and growing contribution to health and nutrition, harvested sweet potato roots and aerial vine components have limited shelf-life and are highly susceptible to post-harvest losses. Although these losses are significant, no information is available to support the design and development of appropriate storage and preservation systems. In this context, the present study was initiated to improve scientific knowledge of sweet potato post-harvest handling and to develop a photovoltaic (PV) ventilated mud storehouse for storage of sweet potato roots under tropical conditions.

In study one, the airflow resistance of sweet potato aerial vine components was investigated. The influence of operating parameters such as airflow rate, moisture content, and bulk depth at different levels was analyzed; all were observed to have a significant (P < 0.01) effect on airflow resistance. Prediction models were developed and found to adequately describe the experimental pressure drop data.

In study two, the resistance to airflow of unwashed and clean sweet potato roots was investigated, along with the effect of root shape factor, surface roughness, orientation to airflow, and presence of soil fraction. The pressure drop through unwashed and clean roots increased with higher airflow, bed depth, root grade composition, and presence of soil fraction. The physical properties of the roots were incorporated into a modified Ergun model, which provided a better fit to the experimental data than a modified Shedd's model.

In study three, the effect of sweet potato root size (medium and large) and of different air velocities and temperatures on the cooling or heating rate and time of individual roots was investigated, and a simulation model based on the fundamental solution of the transient equations was proposed for estimating the cooling and heating time at the centre of the roots. Increasing air velocity significantly (P < 0.05) affected the cooling and heating times, which also differed significantly (P < 0.05) between medium and large roots. Comparison of the simulation results with experimental data confirmed that the transient simulation model can accurately estimate the cooling and heating times of whole sweet potato roots under forced convection.

In study four, the performance of charcoal evaporative cooling pad configurations for integration into sweet potato root storage systems was investigated. Experiments were carried out at different air velocities and water flow rates for three pad configurations, single-layer (SLP), double-layer (DLP), and triple-layer (TLP), made from small and large charcoal particles. Higher air velocity had a pronounced effect on pressure drop, while increasing the water flow rate above the range tested had no practical cooling benefit. The DLP and TLP configurations, with their larger wet surface area for both particle sizes, provided high cooling efficiencies.

In study five, a CFD technique in the ANSYS Fluent software was used to simulate airflow distribution in a low-cost mud storehouse. By investigating different geometries and placements of the air inlet, plenum chamber, and outlet, an acceptable geometry with uniform air distribution was selected and constructed, and experimental measurements validated the selected design.

In study six, the performance of the developed PV ventilated system was investigated. Field measurements showed satisfactory results for the directly coupled PV ventilated system. The option of integrating a low-cost evaporative cooling system into the mud storage structure was also investigated; it reduced the temperature inside the storehouse relative to ambient while enhancing relative humidity. The ability of the developed storage system to provide and maintain the airflow, temperature, and relative humidity that are key to shelf-life extension of sweet potato roots highlights its potential to reduce post-harvest losses at the farmer level, particularly under tropical climate conditions.
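
A minimal sketch of the Ergun equation named in study two, splitting the pressure gradient into viscous and inertial terms; the porosity and effective particle diameter below are illustrative, and the study's modified form additionally incorporates measured root shape and surface-roughness properties.

```python
# Ergun-equation sketch for airflow through a packed bed of roots:
# dP/L = viscous (Blake-Kozeny) + inertial (Burke-Plummer) terms.
# Bed porosity and effective particle diameter are assumptions.
def ergun_dp_per_m(v, eps, d_p, mu=1.8e-5, rho=1.2):
    """Pressure drop per metre of bed depth (Pa/m) at superficial velocity v."""
    viscous = 150.0 * mu * (1 - eps) ** 2 * v / (eps ** 3 * d_p ** 2)
    inertial = 1.75 * rho * (1 - eps) * v ** 2 / (eps ** 3 * d_p)
    return viscous + inertial

# Hypothetical bed of medium roots: porosity 0.45, effective diameter 6 cm
for v in (0.05, 0.1, 0.2, 0.4):   # superficial air velocities (m/s)
    print(f"v = {v:.2f} m/s -> dP/L = {ergun_dp_per_m(v, 0.45, 0.06):.2f} Pa/m")
```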