49 results for DIGITAL ELEVATION MODELS
in Digital Commons at Florida International University
Abstract:
Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a three-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometrical objects, such as high-resolution digital terrain models (DTMs), buildings, and trees. In the past decade, LIDAR has attracted increasing interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, the voluminous data pose a new challenge for automated extraction of geometrical information from LIDAR measurements, because many raster image processing techniques cannot be directly applied to irregularly spaced LIDAR points.

In this dissertation, a framework is proposed to automatically extract different kinds of geometrical objects, such as terrain and buildings, from LIDAR measurements. These objects are essential to numerous applications such as flood modeling, landslide prediction, and hurricane animation. The framework consists of several intuitive algorithms. First, a progressive morphological filter was developed to detect non-ground LIDAR measurements. By gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from the non-ground measurements using a region growing algorithm based on a plane-fitting technique. Raw footprints for segmented building measurements are derived by connecting boundary points and are further simplified and adjusted by several proposed operations to remove the noise caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then further adjusted. Since the adjusting operations for simple building models do not work well on 2D topology, a 2D snake algorithm is proposed to adjust the topology. The 2D snake algorithm consists of newly defined energy functions for topology adjustment and a linear algorithm to find the minimal energy value of 2D snake problems. Data sets from urbanized areas including large institutional, commercial, and small residential buildings were employed to test the proposed framework. The results demonstrate that the proposed framework achieves very good performance.
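A minimal sketch of the progressive-morphology idea described above, assuming the LIDAR point cloud has already been rasterized to a minimum-elevation grid; the window progression, slope parameter, and thresholds are illustrative placeholders rather than the dissertation's exact settings, and scipy's grey opening stands in for the morphological operator.

```python
import numpy as np
from scipy.ndimage import grey_opening

def progressive_morphological_filter(z, cell=1.0, max_window=21, slope=0.3,
                                      dh0=0.3, dh_max=3.0):
    """Classify grid cells as ground/non-ground by progressively opening a
    minimum-elevation surface with growing windows (illustrative sketch)."""
    surface = z.copy()
    ground = np.ones(z.shape, dtype=bool)
    window = 3
    while window <= max_window:
        opened = grey_opening(surface, size=(window, window))
        # Elevation-difference threshold grows with window size and terrain slope.
        dh = min(dh0 + slope * (window - 1) * cell, dh_max)
        ground &= ~((surface - opened) > dh)
        surface = opened
        window = 2 * window - 1          # roughly double the window each pass
    return ground

# Example: a flat plain with one 'building' block; the block is flagged non-ground.
z = np.zeros((50, 50))
z[20:30, 20:30] = 8.0
mask = progressive_morphological_filter(z)
print(mask[25, 25], mask[5, 5])          # False (building), True (ground)
```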
Abstract:
We produced a landscape-scale map of mean tree height in mangrove forests of Everglades National Park (ENP) using elevation data from the Shuttle Radar Topography Mission (SRTM). The SRTM data were calibrated using airborne lidar data and a high-resolution USGS digital elevation model (DEM). The resulting mangrove height map has a mean tree height error of 2.0 m (RMSE) over a 30 m pixel. In addition, we used field data to derive a relationship between mean forest stand height and biomass in order to map the spatial distribution of standing mangrove biomass for the entire National Park. The estimation showed that most of the mangrove standing biomass in the ENP resides in intermediate-height mangrove stands of around 8 m. We estimated the total mangrove standing biomass in ENP to be 5.6 × 10^9 kg.
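The abstract does not report the fitted height-biomass relationship, so the sketch below uses a hypothetical power-law allometry (the coefficients a and b are placeholders) only to illustrate how a calibrated SRTM height map could be turned into a biomass map and a park-wide total.

```python
import numpy as np

# Hypothetical allometry: stand biomass density (kg/m^2) as a power law of
# mean canopy height (m). Coefficients are placeholders, not the paper's fit.
def biomass_density(height_m, a=0.25, b=1.5):
    return a * np.power(height_m, b)

# Toy "calibrated SRTM height map": 30 m pixels, heights in metres (NaN = no mangrove).
heights = np.array([[4.0, 8.0, np.nan],
                    [8.0, 12.0, 8.0]])
pixel_area_m2 = 30.0 * 30.0

density = biomass_density(heights)               # kg per m^2 in each pixel
total_kg = np.nansum(density) * pixel_area_m2    # park-wide standing biomass
print(f"total standing biomass ~ {total_kg:.3e} kg")
```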
Abstract:
Airborne Light Detection and Ranging (LIDAR) technology has become the primary method to derive high-resolution Digital Terrain Models (DTMs), which are essential for studying Earth's surface processes, such as flooding and landslides. The critical step in generating a DTM is to separate ground and non-ground measurements in a voluminous LIDAR point dataset using a filter, because the DTM is created by interpolating ground points. As one of the widely used filtering methods, the progressive morphological (PM) filter has the advantages of classifying the LIDAR data at the point level, a linear computational complexity, and preservation of the geometric shapes of terrain features. The filter works well in an urban setting with a gentle slope and a mixture of vegetation and buildings. However, the PM filter often incorrectly removes ground measurements in topographically high areas along with large non-ground objects, because it uses a constant slope threshold, resulting in "cut-off" errors. A novel cluster analysis method was developed in this study and incorporated into the PM filter to prevent the removal of ground measurements at topographic highs. Furthermore, to obtain optimal filtering results for an area with undulating terrain, a trend analysis method was developed to adaptively estimate the slope-related thresholds of the PM filter based on changes of topographic slope and the characteristics of non-terrain objects. The comparison of the PM and generalized adaptive PM (GAPM) filters for selected study areas indicates that the GAPM filter preserves most of the "cut-off" points incorrectly removed by the PM filter. The application of the GAPM filter to seven ISPRS benchmark datasets shows that the GAPM filter reduces the filtering error by 20% on average, compared with the method used by the popular commercial software TerraScan. The combination of the cluster method, adaptive trend analysis, and the PM filter allows users without much experience in processing LIDAR data to effectively and efficiently identify ground measurements for complex terrain in a large LIDAR data set. The GAPM filter is highly automatic and requires little human input; therefore, it can significantly reduce the effort of manually processing voluminous LIDAR measurements.
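The cluster analysis and trend analysis used by the GAPM filter are not spelled out in the abstract; the sketch below shows only the general idea of making the elevation-difference threshold slope-adaptive, with the slope estimated from the surface itself. The threshold formula and parameter values are assumptions for illustration, not the study's method.

```python
import numpy as np

def slope_adaptive_thresholds(surface, cell=1.0, window_sizes=(3, 5, 9, 17),
                              dh0=0.3, dh_max=5.0):
    """Estimate an elevation-difference threshold per filtering window from the
    prevailing terrain slope, instead of one constant slope (sketch only)."""
    gy, gx = np.gradient(surface, cell)
    slope = np.median(np.hypot(gx, gy))        # robust estimate of terrain slope
    thresholds = {}
    for w in window_sizes:
        dh = dh0 + slope * (w - 1) * cell      # steeper terrain -> larger tolerance
        thresholds[w] = min(dh, dh_max)
    return slope, thresholds

# Example: a tilted plane (10% grade); thresholds grow with window size.
y, x = np.mgrid[0:100, 0:100]
terrain = 0.1 * x
print(slope_adaptive_thresholds(terrain))
```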
Abstract:
The major activities in Year 3 of 'Effect of hydrologic restoration on the habitat of the Cape Sable seaside sparrow (CSSS)' included presentations, field work, data analysis, and report preparation. During this period, we made four presentations: two at the CSSS fire-planning workshops at Everglades National Park (ENP), one at the Society of Wetland Scientists' meeting in Charleston, SC, and a fourth at the Marl Prairie/CSSS performance measure workshop at ENP. We started field work in the third week of January and continued until June 3, 2005. Early in the field season, we completed vegetation surveys along two transects, B and C (~15.1 km). During April and May, vegetation sampling was completed at 199 census sites, bringing to 608 the total number of CSSS census sites with quantitative vegetation data. We updated data sets from all three years, 2003-05, and analyzed them using cluster analysis and ordination as in the previous two years. However, instead of weighted averaging (WA), we used a weighted-averaging partial least squares (WA-PLS) regression model, as this method is considered an improvement over WA for inferring values of environmental variables from biological species composition. We also validated the predictive power of the WA-PLS regression model by applying it to a subset of 100 census sites for which hydroperiods were "known" from two sources: elevations calculated from concurrent water depth measurements onsite and at nearby water level recorders, and USGS digital elevation data. Additionally, we collected biomass samples at 88 census sites and determined live and dead aboveground plant biomass. Using vegetation structure and biomass data from those sites, we developed a regression model that we used to predict aboveground biomass at all transects and census sites. Finally, biomass data were analyzed in relation to hydroperiod and fire frequency.
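As a rough illustration of the inference step, here is a sketch of plain weighted averaging (WA), the simpler method the report contrasts with WA-PLS; WA-PLS adds partial-least-squares components (and WA in practice includes a deshrinking regression) on top of this. The species abundances and hydroperiod values below are made up.

```python
import numpy as np

# Rows = training sites with known hydroperiod, columns = plant species abundances.
abundance = np.array([[5.0, 1.0, 0.0],
                      [2.0, 4.0, 1.0],
                      [0.0, 2.0, 6.0]])
hydroperiod_days = np.array([60.0, 150.0, 300.0])

# Step 1: each species' hydroperiod optimum = abundance-weighted average of site values.
optima = abundance.T @ hydroperiod_days / abundance.sum(axis=0)

# Step 2: infer hydroperiod at a new census site from its species composition.
new_site = np.array([1.0, 3.0, 2.0])
inferred = new_site @ optima / new_site.sum()
print(optima, inferred)
```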
Abstract:
Insect biodiversity is unevenly distributed on local, regional, and global scales. Elevation is a key factor in the uneven distribution of insect diversity, serving as a proxy for a host of environmental variables. My study examines the relationship of Heteroptera (true bugs) species diversity, abundance, and morphology to elevational gradients and land-use regimes on Mt. Kilimanjaro, Tanzania, East Africa. Heteroptera specimens were collected from 60 research sites covering an elevational range of 3,684 m (866-4,550 m above sea level). Thirty of the sites were classified as natural, while the remaining 30 were classified as disturbed (e.g., in agricultural use or converted to grasslands). I measured aspects of the body size of adult specimens and recorded their location of origin. I used regression models to analyze the relationships of Heteroptera species richness, abundance, and body measurements to elevation and land-use regime. Richness and abundance declined with greater elevation, controlling for land use; the declines were linear or logarithmic in form, depending on the model. Richness and abundance were greater in natural than in disturbed sites, controlling for elevation. An interaction indicated that richness decreased more steeply with rising elevation in natural than in disturbed sites. Body length increased as a quadratic function of elevation, adjusting for land use. Body width × length decreased as a logarithmic function of elevation, while leg length/body length decreased as a quadratic function. Leg length/body length was greater in disturbed than in natural sites. Interactions indicated that body length and body width × length were greater in natural than in disturbed sites as elevation rose, although the general trend was downward. Future research should examine the relative importance of land area, temperature, and resource constraints for Heteroptera diversity and morphology on Mt. Kilimanjaro.
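A small sketch of the kind of regression described above, fit with statsmodels on synthetic data: richness regressed on elevation, a land-use indicator, and their interaction. The data and coefficients are invented for illustration only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic stand-in for the 60 sites: elevation (m a.s.l.) and land-use class.
elevation = rng.uniform(866, 4550, size=60)
disturbed = np.repeat([0, 1], 30)                     # 0 = natural, 1 = disturbed
richness = (40 - 0.006 * elevation - 5 * disturbed
            + 0.002 * elevation * disturbed + rng.normal(0, 2, 60))

# Linear model with a land-use main effect and an elevation x land-use interaction.
X = sm.add_constant(np.column_stack([elevation, disturbed, elevation * disturbed]))
fit = sm.OLS(richness, X).fit()
print(fit.params)     # intercept, elevation slope, land-use shift, interaction
```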
Abstract:
In this study, discrete time one-factor models of the term structure of interest rates and their application to the pricing of interest rate contingent claims are examined theoretically and empirically. The first chapter provides a discussion of the issues involved in the pricing of interest rate contingent claims and a description of the Ho and Lee (1986), Maloney and Byrne (1989), and Black, Derman, and Toy (1990) discrete time models. In the second chapter, a general discrete time model of the term structure, from which the Ho and Lee, Maloney and Byrne, and Black, Derman, and Toy models can all be obtained, is presented. The general model also provides for the specification of an additional model, the ExtendedMB model. The third chapter illustrates the application of the discrete time models to the pricing of a variety of interest rate contingent claims. In the final chapter, the performance of the Ho and Lee, Black, Derman, and Toy, and ExtendedMB models in the pricing of Eurodollar futures options is investigated empirically. The results indicate that the Black, Derman, and Toy and ExtendedMB models outperform the Ho and Lee model. Little difference in the performance of the Black, Derman, and Toy and ExtendedMB models is detected.
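As a hedged illustration of the discrete time lattice approach these models share, here is a sketch of a Ho-Lee-style recombining short-rate tree with backward induction for a zero-coupon bond. In the actual models the drift would be calibrated to the initial term structure; this toy version takes it as a constant.

```python
import numpy as np

def ho_lee_tree(r0, sigma, theta, n, dt):
    """Recombining short-rate lattice in the Ho-Lee spirit: at step t the rate moves
    up or down by sigma*sqrt(dt) around a deterministic drift (sketch only)."""
    tree = []
    for t in range(n + 1):
        i = np.arange(t + 1)                 # number of up-moves
        tree.append(r0 + theta * t * dt + sigma * np.sqrt(dt) * (2 * i - t))
    return tree

def zero_coupon_price(tree, dt, face=1.0):
    """Backward induction with risk-neutral probability 1/2 on each branch."""
    values = np.full(len(tree[-1]), face)
    for t in range(len(tree) - 2, -1, -1):
        expected = 0.5 * (values[1:] + values[:-1])
        values = expected * np.exp(-tree[t] * dt)
    return values[0]

tree = ho_lee_tree(r0=0.05, sigma=0.01, theta=0.001, n=4, dt=0.25)
print(zero_coupon_price(tree, dt=0.25))      # price of a 1-year zero, face 1
```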
Abstract:
This study evaluated the relative fit of both Finn's (1989) Participation-Identification and Wehlage, Rutter, Smith, Lesko, and Fernandez's (1989) School Membership models of high school completion to a sample of 4,597 eighth graders taken from the National Educational Longitudinal Study of 1988 (NELS:88), utilizing structural equation modeling techniques. This study found support for the importance of educational engagement as a factor in understanding academic achievement. The Participation-Identification model fit particularly well when applied to the samples of high school completers, dropouts (both overall and White dropouts), and African-American students. This study also confirmed the contribution of school environmental factors (i.e., size, diversity of economic and ethnic status among students) and family resources (i.e., availability of learning resources in the home and parent educational level) to students' educational engagement. Based on these findings, school social workers will need to be more attentive to utilizing macro-level interventions (i.e., community organization, interagency coordination) to achieve the organizational restructuring needed to address future challenges. The support found for the Participation-Identification model supports a shift in school social workers' attention from reactive attempts to improve the affective-interpersonal lives of students to proactive attention to their academic lives. The model concentrates school social work practices on the central mission of schools, which is educational engagement. School social workers guided by this model would be encouraged to seek changes in school policies and organization that would facilitate educational engagement.
Abstract:
Highways are generally designed to serve a mixed traffic flow that consists of passenger cars, trucks, buses, recreational vehicles, etc. The fact that the impacts of these different vehicle types are not uniform creates problems in highway operations and safety. A common approach to reducing the impacts of truck traffic on freeways has been to restrict trucks to certain lane(s) to minimize the interaction between trucks and other vehicles and to compensate for their differences in operational characteristics.

The performance of different truck lane restriction alternatives differs under different traffic and geometric conditions. Thus, a good estimate of the operational performance of different alternatives under prevailing conditions is needed to help make informed decisions on truck lane restrictions. This study develops operational performance models that can be applied to help identify the most operationally efficient truck lane restriction alternative on a freeway under prevailing conditions. The operational performance measures examined in this study include average speed, throughput, speed difference, and lane changes. Prevailing conditions include number of lanes, interchange density, free-flow speeds, volumes, truck percentages, and ramp volumes.

Recognizing the difficulty of collecting sufficient data for an empirical modeling procedure that involves a large number of variables, a simulation approach was used to estimate the performance values for the various truck lane restriction alternatives under various scenarios. Both the CORSIM and VISSIM simulation models were examined for their ability to model truck lane restrictions. Because a major problem was found in the CORSIM model for truck lane modeling, the VISSIM model was adopted as the simulator for this study.

The VISSIM model was calibrated mainly to replicate the capacity given in the 2000 Highway Capacity Manual (HCM) for various free-flow speeds under ideal basic freeway section conditions. Non-linear regression models for average speed, throughput, average number of lane changes, and speed difference between the lane groups were developed. Based on the performance models developed, a simple decision procedure was recommended to select the desired truck lane restriction alternative for prevailing conditions.
Abstract:
The rate of fatal crashes in Florida has remained significantly higher than the national average for the last several years. The 2003 statistics from the National Highway Traffic Safety Administration (NHTSA), the latest available, show a fatality rate in Florida of 1.71 per 100 million vehicle-miles traveled, compared to the national average of 1.48 per 100 million vehicle-miles traveled. The objective of this research is to better understand the driver, environmental, and roadway factors that affect the probability of injury severity in Florida.

In this research, the ordered logit model was used to develop six injury severity models: single-vehicle and two-vehicle crashes on urban freeways and on urban principal arterials, and two-vehicle crashes at urban signalized and unsignalized intersections. The data used in this research included all crashes that occurred on the state highway system from 2001 to 2003 in the Southeast Florida region, which includes Miami-Dade, Broward, and Palm Beach Counties.

The results of the analysis indicate that the age group and gender of the driver at fault were significant factors of injury severity risk across all models. The greatest risk of severe injury was observed for the age groups 55 to 65 and 66 and older. A positive association between injury severity and the race of the driver at fault was also found: an at-fault driver of Hispanic origin was associated with a higher risk of severe injury for both freeway models and for the two-vehicle crash model on arterial roads, and a higher risk of more severe injury was also found when an African-American was the at-fault driver in two-vehicle crashes on freeways. In addition, the arterial class was found to be positively associated with a higher risk of severe crashes. Six-lane divided arterials exhibited the highest injury severity risk of all arterial classes, while the lowest severe injury risk was found for one-way roads. Alcohol involvement by the driver at fault was also found to be a significant risk factor for severe injury in the single-vehicle crash model on freeways.
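A minimal sketch of fitting an ordered logit injury severity model, here with statsmodels on synthetic crash records; the predictors, severity coding, and coefficients are invented stand-ins for the FDOT crash variables used in the study.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 500

# Synthetic stand-ins for crash-level predictors (the study used FDOT crash records).
age_65_plus = rng.integers(0, 2, n)
alcohol = rng.integers(0, 2, n)
latent = 0.8 * age_65_plus + 1.2 * alcohol + rng.logistic(size=n)

# Ordinal outcome: 0 = no injury, 1 = non-incapacitating, 2 = incapacitating/fatal.
severity = pd.Series(pd.Categorical(np.digitize(latent, [0.7, 2.0]),
                                    categories=[0, 1, 2], ordered=True))

X = pd.DataFrame({"age_65_plus": age_65_plus, "alcohol": alcohol})
model = OrderedModel(severity, X, distr="logit")
print(model.fit(method="bfgs", disp=False).summary())
```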
Abstract:
This dissertation presents dynamic flow experiments with fluorescently labeled platelets that allow spatial observation of wall attachment in inter-strut spacings, in order to investigate its relationship to flow patterns. Human blood with fluorescently labeled platelets was circulated through an in vitro system that produced physiologic pulsatile flow in (1) a parallel plate flow chamber that contained two-dimensional (2D) stents featuring completely recirculating flow, partially recirculating flow, and completely reattached flow, and (2) a three-dimensional (3D) cylindrical tube that contained stents of various geometric designs.

Flow detachment and reattachment points exhibited very low platelet deposition. Platelet deposition was very low in the recirculation regions of the 3D stents, unlike in the 2D stents. Deposition distal to a strut was always high in both 2D and 3D stents. Spirally recirculating regions, found in the 3D but not the 2D stents, showed higher deposition than well-separated regions of recirculation.
Abstract:
Liquidity is an important attribute of an asset that investors would like to take into consideration when making investment decisions. However, previous empirical evidence on whether liquidity is a determinant of stock returns is not unanimous. This dissertation provides a comprehensive study of the role of liquidity in asset pricing, using the Fama-French (1993) three-factor model and the Kraus and Litzenberger (1976) three-moment CAPM as models for risk adjustment. The relationships between liquidity and well-known determinants of stock returns, such as size and book-to-market, are also investigated. This study examines the liquidity and asset pricing issues for both intertemporal and cross-sectional data.

The results indicate the existence of a liquidity premium, i.e., less liquid stocks demand a higher rate of return than more liquid stocks. More specifically, a drop of 1 percent in liquidity is associated with a higher rate of return of about 2 to 3 basis points per month. Further investigation reveals that neither the Fama-French three-factor model nor the three-moment CAPM captures the liquidity premium. Finally, the results show that well-known determinants of stock returns such as size and book-to-market do not serve as proxies for liquidity.

Overall, this dissertation shows that a liquidity premium exists in the stock market and that liquidity is a distinct effect that is not subsumed by market factors, non-market factors, or other stock characteristics.
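A sketch of the type of risk-adjusted test described: a time-series regression of excess returns on the three Fama-French factors plus an illiquidity measure, run on synthetic data. The factor construction and the dissertation's actual cross-sectional methodology are not reproduced here; all numbers are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
T = 120   # months of synthetic data (illustrative, not the dissertation's sample)

# Factor series: market excess return, SMB, HML, plus a liquidity characteristic.
mkt, smb, hml = rng.normal(0.005, 0.04, (3, T))
illiquidity = rng.normal(0.0, 1.0, T)            # higher = less liquid
excess_ret = (0.001 + 1.0 * mkt + 0.3 * smb + 0.2 * hml
              + 0.0003 * illiquidity + rng.normal(0, 0.02, T))

# Does illiquidity carry a premium beyond the three factors?
X = sm.add_constant(np.column_stack([mkt, smb, hml, illiquidity]))
fit = sm.OLS(excess_ret, X).fit()
print(fit.params)   # alpha, market beta, SMB, HML, liquidity loadings
```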
Abstract:
With the rapid globalization and integration of world capital markets, more and more stocks are listed in multiple markets. For multi-listed stocks, the traditional measure of systematic risk, the domestic beta, is not appropriate, since it contains information from only one market.

Prakash et al. (1993) developed a technique, the global beta, to capture information from the multiple markets in which a stock is listed. In this study, global betas as well as domestic betas are obtained for 704 multi-listed stocks from 59 world equity markets. Welch tests show that domestic betas are not equal across markets; therefore, the global beta is more appropriate in a global investment setting.

The traditional Capital Asset Pricing Model (CAPM) is also tested with respect to both the domestic beta and the global beta. The results generally support a positive relationship between stock returns and the global beta, while tending to reject this relationship for the domestic beta. Further tests of the international CAPM with the domestic and global betas strengthen this conclusion.
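A small illustration of the contrast between a domestic and a "global" view of beta, using the ordinary covariance estimator against two different index series on synthetic returns. The Prakash et al. (1993) global beta combines information from all markets in which the stock is listed, so this is only a schematic stand-in.

```python
import numpy as np

def beta(stock_returns, index_returns):
    """Covariance/variance estimate of beta against a chosen market index."""
    cov = np.cov(stock_returns, index_returns)
    return cov[0, 1] / cov[1, 1]

rng = np.random.default_rng(3)
world = rng.normal(0.004, 0.03, 250)                  # world market index returns
home = 0.7 * world + rng.normal(0, 0.02, 250)         # home market tracks world loosely
stock = 0.002 + 1.1 * home + 0.4 * world + rng.normal(0, 0.03, 250)

print("domestic beta:", beta(stock, home))    # against the home-market index only
print("global beta:  ", beta(stock, world))   # against the world index
```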
Abstract:
The nation's freeway systems are becoming increasingly congested. A major contributor to traffic congestion on freeways is traffic incidents. Traffic incidents are non-recurring events, such as accidents or stranded vehicles, that cause a temporary reduction in roadway capacity, and they can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic to avoid incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs or real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict. In addition, the duration of an incident is affected by many contributing factors; determining and understanding these factors can help in identifying and developing better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, yet with limited success.

This dissertation research attempts to improve on these previous efforts by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid decision making about traffic diversion in the event of an ongoing incident. Multiple data mining techniques were applied and evaluated in the research. Multiple linear regression analysis and a decision tree based method were applied to develop the offline models, and a rule-based method and a model tree algorithm called M5P were used to develop the online models.

The results show that the models can, in general, achieve high prediction accuracy within acceptable time intervals of the actual durations. The research also identifies some new contributing factors that have not been examined in past studies. As part of the research effort, software code was developed to implement the models in the existing software system of FDOT District 4 for actual applications.
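As a sketch of the tree-based modeling idea, the snippet below fits a scikit-learn regression tree to synthetic incident records. Note that M5P, used for the online models, additionally fits linear models in the leaves, which a plain DecisionTreeRegressor does not; the predictor names and effect sizes here are invented.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(4)
n = 1000

# Synthetic stand-ins for incident attributes (lanes blocked, injury flag, peak hour).
lanes_blocked = rng.integers(0, 4, n)
injury = rng.integers(0, 2, n)
peak_hour = rng.integers(0, 2, n)
duration_min = (20 + 15 * lanes_blocked + 25 * injury + 10 * peak_hour
                + rng.exponential(10, n))

X = np.column_stack([lanes_blocked, injury, peak_hour])
tree = DecisionTreeRegressor(max_depth=3).fit(X, duration_min)

# "Offline"-style prediction for a new incident: 2 lanes blocked, injury, off-peak.
print(tree.predict([[2, 1, 0]]))
```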
Abstract:
A high-frequency physical phase variable electric machine model was developed using finite element (FE) analysis. The model was implemented in a machine drive environment with hardware-in-the-loop. The novelty of the proposed model is that it is derived from the actual geometrical and other physical information of the motor, considering each individual turn in the winding. This is the first attempt to develop such a model to obtain high-frequency machine parameters without resorting to the expensive experimental procedures currently in use. The model was used in a dynamic simulation environment to predict inverter-motor interaction, including motor terminal overvoltage, current spikes, and switching effects. In addition, a complete drive model was developed for electromagnetic interference (EMI) analysis and evaluation. It consists of lumped parameter models of the different system components, such as the cable, inverter, and motor; the lumped parameter models enable faster simulations. The results obtained were verified by experimental measurements, and excellent agreement was obtained. A change in the winding arrangement and its influence on the motor's high-frequency behavior was also investigated; for an equal number of turns, this was shown to have little effect on the parameter values and on the motor's high-frequency behavior. An accurate prediction of overvoltage and EMI in the design stages of the drive system would reduce the time required for design modifications as well as for the evaluation of EMC compliance issues. The model can be utilized in design optimization and insulation selection for motors. Use of this procedure could prove economical, as it would help designers develop and test new motor designs and evaluate their operational impacts in various motor drive applications.
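A heavily reduced illustration of a lumped parameter cable-motor stand-in of the kind mentioned above: a series cable resistance and inductance feeding a parallel motor-terminal capacitance and resistance, excited by an idealized inverter voltage step. The component values are arbitrary placeholders; the point is only that such a lumped model reproduces terminal overvoltage ringing, not that this is the dissertation's circuit.

```python
import numpy as np
from scipy.integrate import solve_ivp

R_cable, L_cable = 0.5, 2e-6          # ohms, henries (illustrative values)
C_motor, R_motor = 1e-9, 5e3          # farads, ohms (illustrative values)
V_dc = 600.0                          # inverter step voltage (V)

def rhs(t, y):
    # State: cable inductor current i_L and motor terminal voltage v_C.
    i_L, v_C = y
    di = (V_dc - R_cable * i_L - v_C) / L_cable
    dv = (i_L - v_C / R_motor) / C_motor
    return [di, dv]

sol = solve_ivp(rhs, (0.0, 2e-6), [0.0, 0.0], max_step=1e-9)
print(f"peak motor terminal voltage ~ {sol.y[1].max():.0f} V (vs {V_dc:.0f} V bus)")
```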
Abstract:
This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user groups for this system are individuals who have moderate to severe visual aberrations for which conventional means of compensation, such as glasses or contact lenses, do not improve their vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to the user, counteract his or her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means to eliminate some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the compensation method. In order to provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, by using a single-lens high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberration. Experiments were conducted on these systems, and the data collected were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method produced a statistically significant improvement in vision for all of the systems. Although significant, the improvement was not as large as expected in the human subject tests. Further analysis suggests that even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing; this would require real-time monitoring of relevant variables (e.g., pupil diameter) and continuous adjustment of the pre-compensation process to yield maximum viewing enhancement.
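A sketch of the core pre-compensation idea under simplifying assumptions: a Wiener-style regularized inverse filter applied in the frequency domain, with a hypothetical Gaussian point-spread function standing in for one derived from wavefront-analyzer data. The dissertation's spatial-information and physiological refinements are not represented here.

```python
import numpy as np

def precompensate(image, psf, k=1e-2):
    """Wiener-style inverse filter: predistort the displayed image so that, after the
    eye's point-spread function blurs it, the result approximates the target image."""
    H = np.fft.fft2(np.fft.ifftshift(psf))            # PSF centered on the grid
    inverse = np.conj(H) / (np.abs(H) ** 2 + k)       # regularized 1/H
    out = np.real(np.fft.ifft2(np.fft.fft2(image) * inverse))
    return np.clip(out, 0.0, 1.0)                     # displays have limited dynamic range

# Toy stand-in for a PSF derived from wavefront data: an isotropic Gaussian blur.
n = 128
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
psf = np.exp(-(x**2 + y**2) / (2 * 3.0**2))
psf /= psf.sum()

image = np.zeros((n, n))
image[40:90, 60:70] = 1.0                  # a high-contrast bar target
display_image = precompensate(image, psf)  # what the LCD would actually show
```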