23 results for Lattice-based construction


Relevance:

30.00%

Publisher:

Abstract:

Geographic Information Systems (GIS) are an emerging information technology (IT) that promises to have a large-scale influence on how spatially distributed resources are managed. GIS has been applied to issues as diverse as recovery from the disaster of Hurricane Andrew and support of military operations in Desert Storm. Implementation of GIS is an important issue because setting up such systems involves high costs and long lead times. An important component of the implementation problem is the "meaning" that the different groups of people influencing the implementation give to the technology. The research was based on the theory of the "Social Construction of Knowledge," which assumes that knowledge systems are subject to sociological analysis both in usage and in content. An interpretive research approach was adopted to inductively derive a model that explains how the "meanings" of a GIS are socially constructed. The research design entailed a comparative case analysis across two county sites that were using the same GIS for a variety of purposes. A total of 75 in-depth interviews were conducted to elicit interpretations of the GIS. Results indicate that differences in how geographers and data processors view the technology lead to different implementation patterns at the two sites.

Relevance:

30.00%

Publisher:

Abstract:

Because some Web users are able to design a template to visualize information from scratch, while other users need information visualized automatically by changing a few parameters, providing different levels of customization of the information is a desirable goal. Our system allows the automatic generation of visualizations given the semantics of the data, and static or pre-specified visualization through an interface language. We address information visualization with the Web in mind, where the presentation of retrieved information is a challenge. We provide a model that narrows the gap between the user's way of expressing queries and database manipulation languages (SQL) without changing the system itself, thus improving the query specification process. We develop a Web interface model that is integrated with HTML to create a powerful language that facilitates the construction of Web-based database reports. In contrast to other work, this model offers a new way of exploring databases, focusing on providing Web connectivity to databases with minimal or no result buffering, formatting, or extra programming. We describe how to easily connect the database to the Web. In addition, we offer an enhanced way of viewing and exploring the contents of a database, allowing users to customize their views depending on the contents and the structure of the data. Current database front-ends typically display database objects in a flat view, making it difficult for users to grasp the contents and the structure of their result. Our model narrows the gap between databases and the Web. The overall objective of this research is to construct a model that accesses different databases easily across the net and generates SQL, forms, and reports across all platforms without requiring the developer to code a complex application; this increases the speed of development. In addition, using only a Web browser, the end user can retrieve data from databases remotely and make the necessary modifications and manipulations of data through Web-formatted forms and reports, independent of the platform, without having to open different applications or learn to use anything but the browser. We introduce a strategic method to generate and construct SQL queries, enabling inexperienced users who have had little exposure to SQL to build a syntactically and semantically valid SQL query and to understand the retrieved data. The generated SQL query is validated against the database schema to ensure harmless and efficient SQL execution. (Abstract shortened by UMI.)
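A hedged sketch of the schema-validation idea described in this abstract is given below in Python; the table, columns, and the build_query helper are hypothetical stand-ins for illustration, not the dissertation's actual interface language.

import sqlite3

# Hypothetical illustration: build a SELECT from user-chosen fields and
# validate every identifier against the live database schema before execution.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")

def table_columns(conn, table):
    """Return the set of column names the schema actually defines."""
    return {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}

def build_query(conn, table, fields, where_field=None):
    valid = table_columns(conn, table)       # empty set means unknown table
    if not valid:
        raise ValueError(f"unknown table: {table}")
    bad = [f for f in fields if f not in valid]
    if bad or (where_field and where_field not in valid):
        raise ValueError(f"unknown column(s): {bad or where_field}")
    sql = f"SELECT {', '.join(fields)} FROM {table}"
    if where_field:
        sql += f" WHERE {where_field} = ?"   # value bound separately, never inlined
    return sql

print(build_query(conn, "orders", ["customer", "total"], "id"))
# -> SELECT customer, total FROM orders WHERE id = ?

Checking every identifier against the live schema and binding values as parameters is one way to keep generated SQL both valid and harmless in the sense described above.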

Relevance:

30.00%

Publisher:

Abstract:

This dissertation reports an investigation of the utility of two intervention programs to facilitate identity formation by way of exploration, one designed from an explicitly self-construction point of view and the other designed from an explicitly self-discovery point of view. The self-construction program was implemented using cognitive skills and orientations derived from Berzonsky (1989), Grotevant (1987), and Kurtines (1999). The self-discovery program was implemented using affective insight development strategies derived from Csikszentmihalyi (1990), Maslow (1968), and Waterman (1990). Three sets of measures were used: (a) cognitive identity measures, (b) affective identity measures, and (c) overall identity measures. Ninety undergraduates from Florida International University completed the intervention. Participants were assigned to one of three intervention conditions (Cognitive, Affective, and Control) and were pretested and posttested on cognitive, affective, and overall identity measures. Intervention strategies were introduced and discussed in the context of specific goals and choices that participants brought to group. Intervention results were then analyzed in terms of the effectiveness of the intervention conditions in promoting their respective developmental domains. The intervention was effective in promoting identity development in comparison to the control condition, with the cognitive condition facilitating cognitive competence and the affective condition facilitating affective insight. Results are discussed in the context of the constructivist and discovery perspectives, as well as in light of the broadened view of exploration offered in this paper.

Relevance:

30.00%

Publisher:

Abstract:

Recent technological developments have made it possible to design various microdevices in which fluid flow and heat transfer are involved. For the proper design of such systems, the governing physics needs to be investigated. Because complex geometries at micro scales are difficult to study with experimental techniques, computational tools have been developed to analyze and simulate flow and heat transfer in microgeometries. However, conventional numerical methods based on the Navier-Stokes equations fail to predict some aspects of microflows, such as nonlinear pressure distribution, increased mass flow rate, slip flow, and temperature jump at the solid boundaries. This necessitates the development of new computational methods, grounded in kinetic theory, that are both accurate and computationally efficient. In this study, the lattice Boltzmann method (LBM) was used to investigate flow and heat transfer in micro-sized geometries. The LBM is based on the Boltzmann equation, which is valid across the whole range of rarefaction regimes observed in microflows. Results were obtained for isothermal channel flows at Knudsen numbers higher than 0.01 at different pressure ratios. LBM solutions for micro-Couette and micro-Poiseuille flow were found to be in good agreement, for pressure distribution and velocity field, with the analytical solutions valid in the slip flow regime (0.01 < Kn < 0.1) and with direct simulation Monte Carlo solutions valid in the transition regime (0.1 < Kn < 10). The isothermal LBM was then extended to simulate flows including heat transfer. The method was first validated for continuum channel flows with and without constrictions by comparing the thermal LBM results against accurate solutions obtained from analytical equations and the finite element method. Finally, the capability of the thermal LBM was improved by adding the effect of rarefaction, and the method was used to analyze the behavior of gas flow in microchannels. The major finding of this research is that the newly developed particle-based method described here can be used as an alternative numerical tool to study non-continuum effects observed in micro-electro-mechanical systems (MEMS).
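For readers unfamiliar with the method, a minimal sketch of an isothermal D2Q9 BGK lattice Boltzmann solver for body-force-driven channel (Poiseuille) flow is given below. The grid size, relaxation time, and forcing are arbitrary illustrative choices, and none of the dissertation's slip or rarefaction modeling is included.

import numpy as np

# Minimal isothermal D2Q9 BGK lattice Boltzmann sketch: force-driven channel
# flow between two bounce-back walls, periodic in x. Illustrative only.
nx, ny, tau, g, steps = 64, 34, 0.8, 1e-6, 8000
w = np.array([4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36])
cx = np.array([0, 1, 0, -1, 0, 1, -1, -1, 1])
cy = np.array([0, 0, 1, 0, -1, 1, 1, -1, -1])
opp = [0, 3, 4, 1, 2, 7, 8, 5, 6]                  # reversed directions
solid = np.zeros((ny, nx), bool); solid[0, :] = solid[-1, :] = True

def feq(rho, ux, uy):
    cu = cx[:, None, None]*ux + cy[:, None, None]*uy
    return w[:, None, None]*rho*(1 + 3*cu + 4.5*cu**2 - 1.5*(ux**2 + uy**2))

f = feq(np.ones((ny, nx)), np.zeros((ny, nx)), np.zeros((ny, nx)))
for _ in range(steps):
    rho = f.sum(axis=0)
    ux = (cx[:, None, None]*f).sum(axis=0)/rho
    uy = (cy[:, None, None]*f).sum(axis=0)/rho
    force = 3*w[:, None, None]*cx[:, None, None]*rho*g     # simple forcing term
    fpost = f - (f - feq(rho, ux, uy))/tau + force         # BGK collision
    fpost[:, solid] = f[opp][:, solid]                     # full-way bounce-back
    for i in range(9):                                     # streaming step
        f[i] = np.roll(np.roll(fpost[i], cx[i], axis=1), cy[i], axis=0)

# The steady profile approaches the parabolic maximum u_max = g*H^2/(8*nu),
# with lattice viscosity nu = (tau - 0.5)/3.
print("centerline velocity:", ux[ny//2].mean())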

Relevance:

30.00%

Publisher:

Abstract:

The spatial and temporal distributions of modern diatom assemblages in surface sediments, on the most dominant macrophytes, and in the water column at 96 locations in Florida Bay, Biscayne Bay and adjacent regions were examined in order to develop paleoenvironmental prediction models for this region. Analyses of these distributions revealed distinct temporal and spatial differences in assemblages among the locations. The differences among diatom assemblages living on subaquatic vegetation, on sediments, and in the water column were significant. Because concentrations of salts, total phosphorus (WTP), total nitrogen (WTN) and total organic carbon (WTOC) are partly controlled by water management in this region, diatom-based models were produced to assess these variables. Discriminant function analyses showed that diatoms can also be used successfully to reconstruct changes in the abundance of diatom assemblages typical of different habitats and life habits. To interpret paleoenvironmental changes, changes in salinity, WTN, WTP and WTOC were inferred from diatoms preserved in sediment cores collected along environmental gradients in Florida Bay (4 cores) and from nearshore and offshore locations in Biscayne Bay (3 cores). The reconstructions showed that water quality conditions in these estuaries have been fluctuating for thousands of years due to natural processes and sea-level changes, but almost synchronized shifts in diatom assemblages occurred in the mid-1960s at all coring locations (except Ninemile Bank and Bob Allen Bank in Florida Bay). These alterations correspond to the major construction of numerous water management structures on the mainland. Additionally, all the coring sites (except Card Sound Bank, Biscayne Bay and Trout Cove, Florida Bay) showed decreasing salinity and fluctuations in nutrient levels in the last two decades, corresponding to increased rainfall in the 1990s and increased freshwater discharge to the bays, a result of increased freshwater deliveries to the Everglades by the South Florida Water Management District in the 1980s and 1990s. Reconstructions of the abundance of diatom assemblages typical of different habitats and life habits revealed multiple sources of diatoms to the coring locations and showed that epiphytic assemblages in both bays have increased in abundance since the early 1990s.
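The discriminant-function step can be sketched with synthetic assemblage data; the taxa proportions and habitat labels below are invented for illustration (using scikit-learn's linear discriminant analysis as a stand-in), and the dissertation's actual species matrix is not reproduced.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Toy discriminant function analysis on diatom assemblages: rows are samples,
# columns are relative abundances of taxa (all synthetic).
rng = np.random.default_rng(0)
n_taxa = 12
habitats = ["sediment", "epiphyte", "plankton"]
X, y = [], []
for h, center in zip(habitats, rng.dirichlet(np.ones(n_taxa), size=3)):
    counts = rng.multinomial(200, center, size=40)   # 40 samples per habitat
    X.append(counts / 200.0)                         # convert to relative abundance
    y += [h]*40
X = np.vstack(X)

lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
print("training accuracy:", lda.score(X, y))         # separability of habitats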

Relevance:

30.00%

Publisher:

Abstract:

A novel modeling approach is applied to karst hydrology. Long-standing problems in karst hydrology and solute transport are addressed using lattice Boltzmann methods (LBMs), which contrast with the other modeling approaches that have been applied to karst hydrology. The motivation of this dissertation is to develop new computational models for solving ground water hydraulics and transport problems in karst aquifers, which are widespread around the globe. This research tests the viability of the LBM as a robust alternative numerical technique for solving large-scale hydrological problems. The LB models applied in this research are briefly reviewed and implementation issues are discussed. The dissertation focuses on testing the LB models. The LBM is tested for two different types of inlet boundary conditions for solute transport in finite and effectively semi-infinite domains, and the solutions are verified against analytical solutions. Zero-diffusion transport and Taylor dispersion in slits are also simulated and compared against analytical solutions. These results demonstrate the LBM's flexibility as a solute transport solver. The LBM is then applied to simulate solute transport and fluid flow in porous media traversed by larger conduits: an LBM-based macroscopic flow solver (based on Darcy's law) is linked with an anisotropic dispersion solver. Spatial breakthrough curves in one and two dimensions are fitted against the available analytical solutions. This provides a steady flow model with capabilities routinely found in ground water flow and transport models (e.g., the combination of MODFLOW and MT3D), while the new LBM-based model retains the ability to solve the inertial flows that are characteristic of karst aquifer conduits. Transient flows in a confined aquifer are solved using two different LBM approaches. The analogy between Fick's second law (the diffusion equation) and the transient ground water flow equation is used to solve the transient head distribution, and an altered-velocity flow solver with a source/sink term is applied to simulate a drawdown curve. Hydraulic parameters such as transmissivity and storage coefficient are linked with LB parameters. These capabilities complete the LBM's treatment of the types of processes that are simulated by standard ground water models. The LB model is verified against field data for drawdown in a confined aquifer.
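The diffusion analogy mentioned above can be sketched compactly: the transient head equation dh/dt = (T/S) d2h/dx2 has the form of Fick's second law, so a standard D1Q3 lattice Boltzmann diffusion scheme can march head in time. The grid, relaxation time, and boundary heads below are illustrative assumptions, not the dissertation's field setup.

import numpy as np

# D1Q3 lattice Boltzmann diffusion sketch for transient head in a confined
# aquifer; the lattice diffusivity plays the role of hydraulic diffusivity T/S.
nx, tau, steps = 200, 0.9, 4000
w = np.array([2/3, 1/6, 1/6])                  # D1Q3 weights (rest, +x, -x)
D = (tau - 0.5)/3                              # lattice diffusivity ~ T/S

f = w[:, None]*np.ones(nx)                     # start at equilibrium, h = 1 everywhere
for _ in range(steps):
    h = f.sum(axis=0)
    f += -(f - w[:, None]*h)/tau               # BGK collision toward equilibrium
    f[1] = np.roll(f[1], 1)                    # stream +x
    f[2] = np.roll(f[2], -1)                   # stream -x
    f[:, 0] = w*0.0                            # fixed head h = 0 at the pumping well
    f[:, -1] = w*1.0                           # fixed head h = 1 in the far field

print("head profile (every 20th node):", np.round(f.sum(axis=0)[::20], 3))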

Relevance:

30.00%

Publisher:

Abstract:

Prior research has established that the idiosyncratic volatility of securities prices exhibits a positive trend. This trend and other factors have made the merits of investment diversification and portfolio construction more compelling. A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to (a) increase the efficiency of the portfolio optimization process, (b) implement large-scale optimizations, and (c) improve the resulting optimal weights. In addition, the technique utilizes a novel approach to the construction of a time-varying covariance matrix: a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model is applied to account for the dynamics of the conditional covariance matrices that are employed. The stochastic aspects of the expected returns of the securities are integrated into the technique through Monte Carlo simulations: instead of being represented as deterministic values, expected returns are assigned simulated values based on their historical measures. The time series of the securities are fitted to the probability distribution that best matches their characteristics using the Anderson-Darling goodness-of-fit criterion. Simulated and actual data sets are used to further generalize the results. Employing the S&P500 securities as the base, 2000 simulated data sets are created using Monte Carlo simulation; in addition, the Russell 1000 securities are used to generate 50 sample data sets. The results indicate an increase in risk-return performance. With Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercial product currently available on the market, as the benchmark, the new greedy technique clearly outperforms the alternatives on samples of the S&P500 and Russell 1000 securities. The resulting improvements in performance are consistent across five securities selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).
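A hedged sketch of the greedy idea, stripped of the dissertation's VaR criterion, Monte Carlo return simulation, and IDCC-GARCH covariance machinery: at each step a small weight increment goes to whichever asset most improves a simple risk-adjusted-return objective. All inputs below are synthetic.

import numpy as np

# Greedy portfolio-weight search (illustrative): allocate the budget in small
# increments, each time to the asset that most improves the objective.
rng = np.random.default_rng(1)
n, increment, steps = 8, 0.01, 100
mu = rng.normal(0.08, 0.03, n)                  # synthetic expected returns
A = rng.normal(size=(n, n))
cov = A @ A.T/n + 0.01*np.eye(n)                # synthetic positive-definite covariance

def objective(wts):
    wts = wts/wts.sum()
    return (wts @ mu)/np.sqrt(wts @ cov @ wts)  # return per unit of risk

w = np.zeros(n)
for _ in range(steps):                          # allocate 1% of the budget per step
    trials = []
    for i in range(n):
        t = w.copy(); t[i] += increment
        trials.append(objective(t))
    w[int(np.argmax(trials))] += increment

w /= w.sum()
print("greedy weights:", np.round(w, 3))
print("objective:", round((w @ mu)/np.sqrt(w @ cov @ w), 3))

Replacing this Sharpe-like objective with a simulated VaR estimate, and the fixed covariance with a GARCH-updated one, recovers the structure the abstract describes.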

Relevance:

30.00%

Publisher:

Abstract:

In the U.S., construction accidents remain a significant economic and social problem. Despite recent improvement, the construction industry has generally lagged behind other industries in implementing safety as a total management process for achieving zero accidents and developing a high-performance safety culture. The aspect of this total approach to safety that has frustrated the construction industry the most is "measurement", which involves identifying and quantifying the factors that critically influence safe work behaviors. The basic problem is the difficulty of determining what to measure and how to measure it, particularly the intangible aspects of safety; without measurement, the notion of continuous improvement is hard to follow. This research was undertaken to develop a strategic framework for the measurement and continuous improvement of total safety in order to achieve and sustain the goal of zero accidents, while improving the quality, productivity and competitiveness of the construction industry as it moves forward. The research was based on an integral model of total safety that allowed the decomposition of safety into interior and exterior characteristics using a multiattribute analysis technique. Statistical relationships between total safety dimensions and safety performance (measured by safe work behavior) were revealed through a series of latent variables (factors) that describe the total safety environment of a construction organization. A structural equation model (SEM) was estimated for the latent variables to quantify the relationships among them and between these total safety determinants and the safety performance of a construction organization. The developed SEM constituted a strategic framework for identifying, measuring, and continuously improving safety as a total concern toward achieving and sustaining the goal of zero accidents.

Relevance:

30.00%

Publisher:

Abstract:

Bio-molecular interactions exist ubiquitously in all biological systems. The aim of this dissertation project was to construct a powerful surface plasmon resonance (SPR) sensor for studying bio-molecular interactions in real time and without labeling. Surface plasmons are oscillations of free electrons in metals coupled with surface electromagnetic waves, and these surface electromagnetic waves provide a sensitive probe for studying bio-molecular interactions on metal surfaces. This project resulted in the successful construction and optimization of a homemade SPR sensor and the development of several new, powerful protocols for studying bio-molecular interactions. It was discovered through this project that the limitations of earlier SPR sensors are related not only to instrumentation design and operating procedures, but also to the complex behaviors of bio-molecules on sensor surfaces, which differ greatly from their behavior in solution. Based on these discoveries, the instrumentation design and operating procedures were fully optimized. A set of existing sensor surface treatment protocols was tested and evaluated, and new protocols were developed in this project; the new protocols demonstrated excellent performance in studying bio-molecular interactions. The optimized homemade SPR sensor was used to study protein-surface interactions, which are responsible for many complex organic cell activities. The coexistence of different driving forces and their correlation with the structures of the protein and the surface make understanding the fundamental mechanism of protein-surface interactions a very challenging task. Using the improved SPR sensor, the electrostatic interaction and the hydrophobic interaction were studied separately. The results of this project directly confirmed the theoretical predictions for the electrostatic force between protein and surface. In addition, this project demonstrated that the strength of the protein-surface hydrophobic interaction does not depend solely on hydrophobicity, as reported earlier; surface structure also plays a significant role.

Relevance:

30.00%

Publisher:

Abstract:

This study examined the representation of national and religious dimensions of Iranian history and identity in Iranian middle school history textbooks. Furthermore, through a qualitative case study in a school in the capital city of Tehran, teachers' use of textbooks in classrooms, students' responses, their perceptions of the country's past, and their definitions of national identity were studied. The study follows a critical discourse analysis framework, focusing on the subjectivity of the text and examining how specific concepts, in this case collective identities, are constructed through historical narratives and how social actors, in this case students, interact with, and make sense of, the process. My definition of national identity is based on the ethnosymbolism paradigm (Smith, 2003), which accommodates both the pre-modern cultural roots of a nation and the development and trajectory of modern political institutions. Two qualitative approaches, discourse analysis and case study, were employed. The textbooks selected were those published by the Ministry of Education and universally used in all middle schools across the country in 2009. The case study was conducted in a girls' school in Tehran. The students who participated were ninth graders in their first year of high school who had just finished a complete course of Iranian history in middle school. Observations were done in history classes in all three grades of the middle school. The study findings show that the textbooks present a generally negative discourse of Iran's long history as being dominated by foreign invasions and incompetent kings. At the same time, the role of Islam and the Muslim clergy in salvaging the country from its despair gradually elevates throughout history, becomes prominent in modern times, and finally culminates in the Islamic Revolution as the ultimate point of victory for the Iranian people. Throughout this representation, Islam becomes increasingly dominant in the textbooks' narrative of Iranian identity, and by the time of the Islamic Revolution it morphs into its single most prominent element. The students, on the other hand, have created their own image of Iran's history and Iranian identity that diverges from that of the textbooks, especially in their recollection of modern times. They have internalized the generally negative narrative of the textbooks, but have not accepted the positive role of Islam and the Muslim clergy. Their notion of Iranian identity is dominated by feelings of defeat and failure, anecdotal elements of pride in the very ancient history, and a sense of passivity and helplessness.

Relevance:

30.00%

Publisher:

Abstract:

The synthesis and functionalization of large-area graphene and its structural, electrical and electrochemical properties have been investigated. First, graphene films grown by thermal chemical vapor deposition (CVD) contain three to five atomic layers of graphene, as confirmed by Raman spectroscopy and high-resolution transmission electron microscopy. The graphene film is then treated with CF4 reactive-ion plasma to dope fluorine ions into the graphene lattice, as confirmed by X-ray photoelectron spectroscopy (XPS) and UV photoemission spectroscopy (UPS). Electrochemical characterization reveals that the catalytic activity of graphene for iodine reduction is enhanced with increasing plasma treatment time, which is attributed to an increase in the number of catalytic sites available for charge transfer. The fluorinated graphene is characterized as a counter-electrode (CE) in a dye-sensitized solar cell (DSSC), showing ~2.56% photon-to-electron conversion efficiency with ~11 mA cm−2 current density. Second, the large-scale graphene film is covalently functionalized with HNO3 to form a high-efficiency electro-catalytic electrode for DSSCs. XPS and UPS confirm, respectively, the covalent attachment of C-OH, C(O)OH and NO3− moieties to carbon atoms through sp2-sp3 hybridization, and the Fermi level shift of graphene under different doping concentrations. Finally, a CoS-implanted graphene (G-CoS) film was prepared using CVD followed by the SILAR method. The G-CoS electro-catalytic electrodes, characterized as a DSSC CE, are found to be highly electro-catalytic toward iodine reduction, with low charge transfer resistance (Rct ≈ 5.05 Ω cm²) and high exchange current density (J0 ≈ 2.50 mA cm−2). The improved performance compared to pristine graphene is attributed to the increased number of active catalytic sites of G-CoS and the highly conducting path of graphene. We also studied the synthesis and characterization of a graphene-carbon nanotube (CNT) hybrid film consisting of graphene supported by vertical CNTs on a Si substrate. The hybrid film is inverted and transferred to flexible substrates for application in flexible electronics, demonstrating a distinguishable variation of electrical conductivity under both tension and compression. Furthermore, both the turn-on field and the total emission current were found to depend strongly on the bending radius of the film, varying in the ranges 0.8-3.1 V/μm and 4.2-0.4 mA, respectively.

Relevance:

30.00%

Publisher:

Abstract:

The safety of workers in nighttime roadway work zones has become a major concern for state transportation agencies due to the increase in the number of work zone fatalities. During the last decade, several studies have focused on improving safety in nighttime roadway work zones, but the element that is still missing is a set of tools for translating the research results into practice. This paper discusses (1) the importance of translating into practice the research results related to worker safety and the safety planning of nighttime work zones, and (2) examples of tools that can be used for translating the results of such studies into practice. A tool that can propose safety recommendations for nighttime work zones and a web-based safety training tool for workers are presented. The tools were created as a component of a five-year research study on the assessment of the safety of nighttime roadway construction. The objectives of both tools are explained, as well as their functionalities (i.e., what the tools can do for the users), their components (e.g., knowledge base, database, and interfaces), and their structures (i.e., how the components of the tools are organized to meet the objectives). Evaluations by the proposed users of each tool are also presented.

Relevance:

30.00%

Publisher:

Abstract:

Lineup procedures have recently garnered extensive empirical attention in an effort to reduce the number of mistaken identifications that plague the criminal justice system. Relatively little attention, however, has been paid to the influence of the lineup constructor or the lineup construction technique on the quality of the lineup. This study examined, in a series of three phases, whether the cross-race effect influences the quality of lineups constructed using a match-to-suspect or match-to-description technique. Participants generated descriptions of same- and other-race targets in Phase 1, which were used in Phase 2. In Phase 2, participants were asked to create lineups for own-race and other-race targets using one of the two techniques. The lineups created in this phase were examined for quality in Phase 3 by calculating lineup fairness assessments through the use of a mock witness paradigm. Overall, the results of these experimental phases suggest that the race of those involved in the lineup construction process influences lineups. There was no difference in witness description accuracy in Phase 1, which ran counter to predictions based on the cross-race effect. The cross-race effect was observed, however, in Phases 2 and 3. The lineup construction technique used also influenced several of the process measures, selection estimates, and fairness judgments in Phase 2. Interestingly, for some measures in both phases the presence of the cross-race effect was in the opposite direction to that predicted. In Phase 2, the cross-race effect was as predicted for the number of foils viewed, but in the opposite direction for the average time spent viewing each foil. In Phase 3, the cross-race effect was in the opposite direction than predicted, with higher levels of lineup fairness in other-race lineups. The practical implications of these findings are discussed in relation to lineup fairness within the legal system.
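The mock-witness fairness calculations can be illustrated with two measures commonly used in the eyewitness literature: functional size (Wells, Leippe, & Ostrom, 1979) and an inverse-Simpson form of effective size often attributed to Tredoux (1998). The choice counts below are invented, and the study's own fairness measures may differ.

from collections import Counter

# Illustrative lineup-fairness measures from synthetic mock witness choices.
choices = ["suspect"]*9 + ["foil1"]*5 + ["foil2"]*4 + ["foil3"]*1 + ["foil4"]*1
n = len(choices)
counts = Counter(choices)

functional_size = n / counts["suspect"]              # N / suspect identifications
effective_size = 1.0 / sum((k/n)**2 for k in counts.values())

print(f"functional size: {functional_size:.2f}")     # 20/9, about 2.22
print(f"effective size:  {effective_size:.2f}")      # higher = fairer spread of choices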

Relevance:

30.00%

Publisher:

Abstract:

Non-Destructive Testing (NDT) of deep foundations has become an integral part of the industry's standard manufacturing processes, and it is not unusual for the evaluation of the integrity of the concrete to include the measurement of ultrasonic wave speeds. Numerous methods have been proposed that use the propagation speed of ultrasonic waves to check the integrity of concrete in drilled shaft foundations. All such methods evaluate the integrity of the concrete inside the cage and between the access tubes. The integrity of the concrete outside the cage remains to be considered in order to determine the location of the border between the concrete and the soil, and hence the diameter of the drilled shaft. It is also economical to devise a methodology that obtains the diameter of the drilled shaft using the Cross-Hole Sonic Logging (CSL) system: such a methodology can be performed with the same equipment, immediately following the CSL tests used to check the integrity of the inside concrete, allowing the drilled shaft diameter to be determined without setting up another NDT device. The proposed new method is based on installing galvanized tubes outside the shaft, each across from an inside tube, and performing the CSL test between the inside and outside tubes. From the experimental work performed, a model is developed to evaluate the relationship between the thickness of the concrete and the ultrasonic wave properties using signal processing. The experimental results show a direct correlation between the concrete thickness outside the cage and the maximum amplitude of the received signal obtained from frequency-domain data. This study demonstrates how this new method of measuring the diameter of drilled shafts during construction using an NDT method overcomes the limitations of currently used methods. In another part of the study, a new method is proposed to visualize and quantify the extent and location of defects. It is based on a color change in the frequency amplitude of the signal recorded by the receiver probe at the location of defects and is called Frequency Tomography Analysis (FTA). Time-domain data for the signals propagated between tubes are transferred to the frequency domain using the Fast Fourier Transform (FFT), and the distribution of the FTA is then evaluated. This method is employed after CSL has determined a high probability of an anomaly in a given area, and is applied to improve location accuracy and to further characterize the feature. The technique has very good resolution and clarifies the exact depth of any void or defect inside the cage along the length of the drilled shaft. The last part of the study evaluates the effect of voids inside and outside the reinforcement cage, and of corrosion in the longitudinal bars, on the strength and axial load capacity of drilled shafts. The objective is to quantify the extent of the loss in axial strength and stiffness of drilled shafts due to the presence of different types of symmetric voids and corrosion throughout their lengths.
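The FFT step at the heart of the FTA method can be sketched as follows; the sampling rate and the synthetic 50 kHz burst are assumptions for illustration, not the study's field data.

import numpy as np

# Move a received CSL signal to the frequency domain and extract its peak
# spectral amplitude, the quantity FTA maps to a color scale at each depth.
fs = 500_000                                   # sampling rate, Hz (assumed)
t = np.arange(0, 2e-3, 1/fs)                   # 2 ms record
signal = np.exp(-((t - 4e-4)/1e-4)**2) * np.sin(2*np.pi*50_000*t)  # 50 kHz burst
signal += 0.05*np.random.default_rng(2).normal(size=t.size)        # measurement noise

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(t.size, 1/fs)
amplitude = np.abs(spectrum)

peak = freqs[np.argmax(amplitude)]
print(f"peak frequency: {peak/1e3:.1f} kHz, max amplitude: {amplitude.max():.2f}")
# Drops in this maximum amplitude along depth flag possible voids or reduced
# concrete thickness between a tube pair.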

Relevance:

30.00%

Publisher:

Abstract:

This dissertation introduces a new approach for assessing the effects of pediatric epilepsy on the language connectome. Two novel data-driven network construction approaches are presented. These methods rely on connecting different brain regions using either the extent or the intensity of language-related activations, as identified by independent component analysis of fMRI data. An auditory description decision task (ADDT) paradigm was used to activate the language network in 29 patients and 30 controls recruited from three major pediatric hospitals. Empirical evaluations illustrated that pediatric epilepsy can cause, or is associated with, a reduction in network efficiency. Patients showed a propensity to inefficiently employ the whole brain network to perform the ADDT language task; controls, on the contrary, seemed to efficiently use smaller segregated network components to achieve the same task. To explain the causes of the decreased efficiency, graph theoretical analysis was carried out. The analysis revealed no substantial global network feature differences between the patient and control groups. It also showed that for both subject groups the language network exhibited small-world characteristics; however, the patients' extent-of-activation network showed a tendency toward more random networks. The intensity-of-activation network displayed ipsilateral hub reorganization at the local level: the left hemispheric hubs displayed greater centrality values for patients, whereas the right hemispheric hubs displayed greater centrality values for controls. This hub hemispheric disparity was not correlated with the atypical right language laterality found in six patients. Finally, it was shown that a multi-level unsupervised clustering scheme based on self-organizing maps (a type of artificial neural network) and k-means was able to fairly and blindly separate the subjects into their respective patient and control groups. The clustering was initiated using only the local nodal centrality measurements. Compared to the extent-of-activation network, clustering on the intensity-of-activation network demonstrated better precision. This outcome supports the assertion that the local centrality differences presented by the intensity-of-activation network can be associated with focal epilepsy.
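The final clustering step can be sketched with synthetic centrality vectors; the group means, the number of regions, and the omission of the self-organizing-map stage are all simplifications for illustration.

import numpy as np
from sklearn.cluster import KMeans

# Toy version of the blind subject clustering: each subject is described by a
# vector of local nodal centralities and split into two groups with k-means.
rng = np.random.default_rng(3)
n_nodes = 90                                       # one centrality value per region (assumed)
patients = rng.normal(0.40, 0.10, size=(29, n_nodes))
controls = rng.normal(0.55, 0.10, size=(30, n_nodes))
X = np.vstack([patients, controls])
true = np.array([0]*29 + [1]*30)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
agreement = max((labels == true).mean(), (labels != true).mean())  # label-invariant
print(f"cluster/group agreement: {agreement:.2f}")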