7 results for optical transfer function

in Digital Commons at Florida International University


Relevance:

80.00%

Publisher:

Abstract:

Florida Bay is a highly dynamic estuary that exhibits wide natural fluctuations in salinity due to changes in the balance of precipitation, evaporation and freshwater runoff from the mainland. Rapid and large-scale modification of freshwater flow and construction of transportation conduits throughout the Florida Keys during the late nineteenth and twentieth centuries reshaped water circulation and salinity patterns across the ecosystem. To determine long-term patterns in salinity variation across the Florida Bay estuary, we used a diatom-based salinity transfer function (root mean square error of prediction 3.27 ppt) to infer salinity from diatom assemblages in four ~130 year old sediment records. Sites were distributed along a gradient of exposure to anthropogenic shifts in the watershed and salinity. Precipitation was found to be the primary driver of salinity fluctuations over the entire record, but watershed modifications on the mainland and in the Florida Keys during the late-1800s and 1900s were the most likely cause of significant shifts in baseline salinity. The timing of these shifts in the salinity baseline varies across the Bay: that of the northeastern coring location coincides with the construction of the Florida Overseas Railway (AD 1906–1916), while that of the east-central coring location coincides with the drainage of Lake Okeechobee (AD 1881–1894). Subsequent decreases occurring after the 1960s (east-central region) and early 1980s (southwestern region) correspond to increases in freshwater delivered through water control structures in the 1950s–1970s and again in the 1980s. Concomitant increases in salinity in the northeastern and south-central regions of the Bay in the mid-1960s correspond to an extensive drought period and the occurrence of three major hurricanes, while the drop in the early 1970s could not be related to any natural event.
This paper provides information about major factors influencing salinity conditions in Florida Bay in the past and quantitative estimates of the pre- and post-South Florida watershed modification salinity levels in different regions of the Bay. This information should be useful for environmental managers in setting restoration goals for the marine ecosystems in South Florida, especially for Florida Bay.
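
The diatom-based inference described above is, in spirit, a weighted-averaging transfer function: each taxon is assigned a salinity optimum from a modern calibration set, and a fossil sample's salinity is inferred as the abundance-weighted mean of those optima. A minimal sketch of that idea (the study's actual model, calibration data, and any deshrinking step are not given here; all numbers below are illustrative):

```python
import numpy as np

def wa_optima(counts, env):
    """Weighted-averaging optima: for each taxon, the abundance-weighted
    mean of the environmental variable across the calibration sites."""
    counts = np.asarray(counts, float)  # sites x taxa abundances
    env = np.asarray(env, float)        # salinity at each calibration site
    return counts.T @ env / counts.sum(axis=0)

def wa_infer(sample, optima):
    """Infer salinity for a fossil sample as the abundance-weighted
    mean of the taxon optima."""
    sample = np.asarray(sample, float)
    return sample @ optima / sample.sum()

# toy calibration set: 4 sites x 3 taxa, with observed salinities (ppt)
counts = [[10, 0, 0], [5, 5, 0], [0, 5, 5], [0, 0, 10]]
salinity = [10.0, 20.0, 30.0, 40.0]
opt = wa_optima(counts, salinity)
inferred = wa_infer([0, 10, 0], opt)  # a sample dominated by the middle taxon
```

Published transfer functions typically add a deshrinking regression and report a cross-validated root mean square error of prediction, such as the 3.27 ppt quoted above.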

Relevance:

80.00%

Publisher:

Abstract:

With the progress of computer technology, computers are expected to be more intelligent in their interaction with humans, presenting information according to the user's psychological and physiological characteristics. However, computer users with visual problems may have difficulty perceiving icons, menus, and other graphical information displayed on the screen, limiting the efficiency of their interaction with computers. In this dissertation, a personalized and dynamic image precompensation method was developed to improve the visual performance of computer users with ocular aberrations. The precompensation was applied to the graphical targets before presenting them on the screen, aiming to counteract the visual blurring caused by the ocular aberration of the user's eye. A complete and systematic modeling approach to describe the retinal image formation of the computer user was presented, drawing on modeling tools such as Zernike polynomials, the wavefront aberration function, the Point Spread Function, and the Modulation Transfer Function. The ocular aberration of the computer user was first measured by a wavefront aberrometer, as a reference for the precompensation model. The dynamic precompensation was then generated from the aberration rescaled to the pupil diameter, which was monitored in real time. The potential visual benefit of the dynamic precompensation method was explored through software simulation, using aberration data from a real human subject. An "artificial eye" experiment was conducted by simulating the human eye with a high-definition camera, providing an objective evaluation of image quality after precompensation. In addition, an empirical evaluation with 20 human participants was designed and implemented, involving image recognition tests performed under a more realistic viewing environment of computer use.
The statistical analysis of the empirical experiment confirmed the effectiveness of the dynamic precompensation method by showing a significant improvement in recognition accuracy. The merit and necessity of dynamic precompensation were also substantiated by comparison with static precompensation. The visual benefit of dynamic precompensation was further confirmed by the subjective assessments collected from the evaluation participants.
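
Precompensation of this kind is essentially inverse filtering through the eye's optical transfer function (the Fourier transform of the PSF). A regularized, Wiener-style sketch, assuming a known, shift-invariant PSF; this is an illustration of the general idea, not the dissertation's exact pipeline:

```python
import numpy as np

def wiener_precompensate(image, psf, k=0.01):
    """Pre-compensate an image for a known blur: divide its spectrum by
    the optical transfer function (FFT of the PSF), regularized
    Wiener-style so near-zero OTF values do not blow up."""
    # center the PSF at the array origin before taking its FFT
    otf = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    inv = np.conj(otf) / (np.abs(otf) ** 2 + k)  # regularized inverse filter
    pre = np.real(np.fft.ifft2(np.fft.fft2(image) * inv))
    return np.clip(pre, 0.0, 1.0)  # keep the result displayable

# sanity check with an identity PSF (a centered delta): the output is
# just the input scaled by 1/(1+k)
psf = np.zeros((8, 8)); psf[4, 4] = 1.0
img = np.full((8, 8), 0.5)
out = wiener_precompensate(img, psf, k=0.01)
```

When the precompensated image is then blurred by the eye's actual PSF, the two distortions approximately cancel, which is what the clipping to the displayable range limits in practice.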

Relevance:

30.00%

Publisher:

Abstract:

We studied the development of leaf characters in two Southeast Asian dipterocarp forest trees under different photosynthetic photon flux densities (PFD) and spectral qualities (red to far-red, R:FR). The two species, Hopea helferi and H. odorata, are taxonomically closely related but differ in their ecological requirements; H. helferi is more drought tolerant and H. odorata more shade tolerant. Seedlings were grown in replicated shadehouse treatments of differing PFD and R:FR. We measured or calculated (1) leaf and tissue thicknesses; (2) mesophyll parenchyma, air space, and lignified tissue volumes; (3) mesophyll air volumes (Vmes/Asurf) and surfaces (Ames/Asurf); (4) palisade cell length and width; (5) chlorophyll/cm² and a/b; (6) leaf absorption; and (7) attenuance/absorbance at 652 and 550 nm. These characters varied in response to light conditions in both taxa. Characters were predominantly affected by PFD, and R:FR slightly influenced many characters. Leaf characters of H. odorata were more plastic in response to treatment conditions. Characters were correlated with each other in a complex fashion. Variation in leaf anatomy is most likely a consequence of increasing leaf thickness in both taxa, which may increase mechanical strength and defense against herbivory in more exposed environments. Variation in leaf optical properties was most likely affected by pigment photo-bleaching in treatments of more intense PFD and was not correlated with Amax. The greater plasticity of leaf responses in H. odorata helps explain its acclimation over the range of light conditions encountered by this shade-tolerant taxon. The dense layer of scales on the leaf undersurface and other anatomical characters in H. helferi reduced gas exchange and growth in this drought-tolerant tree.

Relevance:

30.00%

Publisher:

Abstract:

A high resolution study of the quasielastic 2H(e, e'p)n reaction was performed in Hall A at the Thomas Jefferson National Accelerator Facility in Newport News, Virginia. The measurements were performed at a central momentum transfer of |q| ∼ 2400 MeV/c and a central energy transfer of ω ∼ 1500 MeV, corresponding to a four-momentum transfer Q² = 3.5 (GeV/c)², covering missing momenta from 0 to 0.5 GeV/c. The majority of the measurements were performed at Φ = 180° and a small set of measurements were done at Φ = 0°. The Hall A High Resolution Spectrometers (HRS) were used to detect electrons and protons in coincidence. Absolute 2H(e, e'p)n cross sections were obtained as a function of the recoiling neutron scattering angle with respect to [special characters omitted]. The experimental results were compared to a Plane Wave Impulse Approximation (PWIA) model and to a calculation that includes Final State Interaction (FSI) effects. Experimental 2H(e, e'p)n cross sections were determined with an estimated systematic uncertainty of 7%. The general features of the measured cross sections are reproduced by Glauber-based calculations that take the motion of the bound nucleons into account (GEA). Final State Interaction (FSI) contributions were found to depend strongly on the angle of the recoiling neutron with respect to the momentum transfer and on the missing momentum. We found a systematic deviation of the theoretical prediction of about 30%: at small θnq (θnq < 60°) the theory overpredicts the cross section, while at large θnq (θnq > 80°) it underestimates the cross sections. We observed an enhancement of the cross section due to FSI, as compared to PWIA, of about 240% for a missing momentum of 0.4 GeV/c at an angle of 75°. For a missing momentum of 0.5 GeV/c the enhancement due to the same FSI effects was about 270%. This is in agreement with GEA.
Standard Glauber calculations predict this large contribution to occur at an angle of 90°. Our results show that GEA better describes the 2H(e, e'p)n reaction.
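
As a quick consistency check on the quoted kinematics, the four-momentum transfer follows from the central |q| and ω via Q² = |q|² − ω²:

```python
# central kinematics quoted in the abstract
q = 2.4      # three-momentum transfer |q| in GeV/c
omega = 1.5  # energy transfer in GeV

# spacelike four-momentum transfer: Q^2 = |q|^2 - omega^2
Q2 = q**2 - omega**2  # ≈ 3.51, consistent with the quoted Q^2 = 3.5 (GeV/c)^2
```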

Relevance:

30.00%

Publisher:

Abstract:

Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified using profiling tools. Hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeatedly executed functions. The performance of SoC systems can then be improved if hardware acceleration is applied to the elements that incur performance overheads. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core. The hotspot function of the target application is identified using critical attributes such as cycles per loop and loop counts. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, a central bus design and a co-processor design, are implemented for comparison in the proposed architecture.
(3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed. The trade-offs among these three factors are compared and balanced, and different hardware accelerators are implemented and evaluated against system requirements. (4) A system verification platform is designed based on the Integrated Circuit (IC) workflow, with hardware optimization techniques used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves a 7.9X performance improvement and saves 75.85% of energy consumption.
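
System-level speedups from accelerating one hotspot are bounded by the fraction of runtime that hotspot accounts for, which is Amdahl's law. A sketch with hypothetical numbers, since the abstract does not report the hotspot fraction or the raw accelerator speedup:

```python
def amdahl_speedup(hotspot_fraction, accel_factor):
    """Overall speedup when a fraction of runtime is sped up by accel_factor;
    the remaining (1 - fraction) of the runtime is unchanged."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accel_factor)

# hypothetical: a hotspot consuming 90% of runtime, accelerated 20x in hardware
s = amdahl_speedup(0.9, 20.0)   # ~6.9x overall
cap = 1.0 / (1.0 - 0.9)         # 10x ceiling even with infinite acceleration
```

This is why profiling to find the true hotspot (step 1 above) matters: accelerating a function that takes only a small share of runtime cannot deliver system-level gains like 2.8X or 7.9X.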

Relevance:

30.00%

Publisher:

Abstract:

Traditional optics has provided ways to compensate for some common visual limitations (up to second-order visual impairments) through spectacles or contact lenses. Recent developments in wavefront science make it possible to obtain an accurate model of the Point Spread Function (PSF) of the human eye. Through what is known as the "Wavefront Aberration Function" of the human eye, exact knowledge of the optical aberration of the human eye is possible, allowing a mathematical model of the PSF to be obtained. This model can be used to pre-compensate (inverse-filter) the images displayed on computer screens in order to counter the distortion in the user's eye. This project takes advantage of the fact that the wavefront aberration function, commonly expressed as a Zernike polynomial, can be generated from the ophthalmic prescription used to fit spectacles to a person. This allows the pre-compensation, or on-screen deblurring, to be done for various visual impairments up to second order (commonly known as myopia, hyperopia, or astigmatism). We present the technique proposed toward that goal, along with results obtained by introducing a lens of known PSF into the visual path of subjects without visual impairment. In addition to substituting for the effect of spectacles or contact lenses in correcting the low-order visual limitations of the viewer, the significance of this approach is that it has the potential to address higher-order abnormalities in the eye, currently not correctable by simple means.
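
The prescription-to-Zernike step can be sketched as follows: the sphere/cylinder/axis prescription is rewritten as the power vector (M, J0, J45) and then scaled to second-order Zernike coefficients for a given pupil radius. Sign and normalization conventions vary across the literature (this follows one common convention attributed to Thibos et al.), so treat the code as an illustrative sketch rather than the project's exact formulation:

```python
import math

def rx_to_zernike(sphere, cyl, axis_deg, pupil_radius_m):
    """Convert an ophthalmic prescription (diopters, axis in degrees) to
    second-order Zernike coefficients in meters of wavefront error."""
    th = math.radians(axis_deg)
    M   = sphere + cyl / 2.0                # spherical-equivalent power
    J0  = -(cyl / 2.0) * math.cos(2 * th)   # with/against-the-rule astigmatism
    J45 = -(cyl / 2.0) * math.sin(2 * th)   # oblique astigmatism
    r2 = pupil_radius_m ** 2
    c20  = -M   * r2 / (4.0 * math.sqrt(3.0))  # defocus, Z(2, 0)
    c22  = -J0  * r2 / (2.0 * math.sqrt(6.0))  # astigmatism, Z(2, 2)
    c2m2 = -J45 * r2 / (2.0 * math.sqrt(6.0))  # astigmatism, Z(2, -2)
    return c20, c22, c2m2

# example: -2.00 sphere, -1.00 cylinder at axis 90, 3 mm pupil radius
c20, c22, c2m2 = rx_to_zernike(-2.0, -1.0, 90.0, 0.003)
```

From these coefficients the wavefront aberration function, and hence the PSF used for the on-screen pre-compensation, can be evaluated over the pupil.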