902 results for Digital techniques


Relevance:

30.00%

Publisher:

Abstract:

The project demonstrates the use of modern technologies for the preservation and presentation of cultural and historical heritage. The idea is to create a database of cultural and historical heritage sites by applying three-dimensional laser scanning together with geodetic and photogrammetric methods and imaging techniques. For the purposes of this project, we focused on several heritage sites in the central part of Sofia. We included these particular buildings because there is hardly another city in the world where four temples of different religions - Jewish, Muslim, Orthodox and Catholic - are located within a radius of 400 m. In recent years, the preservation of cultural heritage has been increasingly linked to the objectives of sustainable development. Today it has become clear that cultural heritage is also an economic resource that should be used for further economic development (while preserving its authentic cultural values). There has been a more active public debate on the role of cultural heritage around topics such as improving the quality of life through the development of cultural tourism, which increases employment and steadily improves the business climate. Cultural heritage preservation is becoming one of the priority objectives of urban development policy. The focus has shifted to new ways of preservation, mainly combinations of sophisticated technological solutions applied to the preservation and dissemination of cultural heritage.


In this paper, we discuss recent advances in digital signal processing techniques for compensation of the laser phase noise and fiber nonlinearity impairments in coherent optical orthogonal frequency division multiplexing (CO-OFDM) transmission. For laser phase noise compensation, we focus on quasi-pilot-aided (QPA) and decision-directed-free blind (DDF-blind) phase noise compensation techniques. For fiber nonlinearity compensation, we discuss in detail the principle and performance of the phase-conjugated pilots (PCP) scheme.
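The pilot-based side of these techniques rests on a simple observation: laser phase noise rotates every subcarrier of an OFDM symbol by a common phase, which known pilot subcarriers reveal. The sketch below illustrates common-phase-error correction in a minimal, noise-free setting; the subcarrier count, pilot spacing and phase value are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

# Minimal sketch of pilot-aided common-phase-error (CPE) correction, the
# basic mechanism behind pilot-based laser phase noise compensation in
# CO-OFDM. All parameter values below are illustrative assumptions.

rng = np.random.default_rng(0)
n_sc = 64                           # subcarriers per OFDM symbol
pilot_idx = np.arange(0, n_sc, 8)   # known pilot positions
qpsk = np.exp(1j * np.pi / 4 * np.array([1, 3, 5, 7]))

tx = qpsk[rng.integers(0, 4, n_sc)]  # transmitted QPSK subcarriers
tx[pilot_idx] = 1 + 0j               # pilots carry a known value

rx = tx * np.exp(1j * 0.3)           # laser phase noise: common 0.3 rad rotation

# Estimate the common phase from the pilots, then de-rotate all subcarriers.
est = np.mean(rx[pilot_idx] / tx[pilot_idx])
est /= abs(est)                      # keep only the phase of the estimate
corrected = rx * np.conj(est)

max_err = float(np.max(np.abs(corrected - tx)))
```

In practice the estimate is averaged over noisy pilots, but the de-rotation step is the same.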


Premium intraocular lenses (IOLs) such as toric IOLs, multifocal IOLs (MIOLs) and accommodating IOLs (AIOLs) can provide better refractive and visual outcomes than standard monofocal designs, leading to greater post-operative spectacle independence. The principal theme of this thesis is the development of new assessment techniques that can help to improve future premium IOL design. IOLs designed to correct astigmatism form the focus of the first part of the thesis. A novel toric IOL design was devised to decrease the effect of toric rotation on patient visual acuity, but was found to have neither a beneficial nor a detrimental impact on visual acuity retention. IOL tilt, like rotation, may curtail visual performance; however, current IOL tilt measurement techniques require specialist equipment not readily available in most ophthalmological clinics. A new method was therefore proposed that applies Pythagoras's theorem to the symmetry of the IOL optic in digital images in order to calculate tilt, and it was shown to be both accurate and highly repeatable. A literature review revealed little information on the relationship between IOL tilt, decentration and rotation, so this was examined. These factors correlated poorly, indicating that they occur independently of each other. Next, presbyopia-correcting IOLs were investigated. The light distribution of different MIOLs and an AIOL was assessed using perimetry, to establish whether this could inform optimal IOL design. Anticipated differences in threshold sensitivity between IOLs were not found, however, so perimetry was concluded to be ineffective in mapping the retinal projection of blur. The observed difference between subjective and objective measures of accommodation, arising from the influence of pseudoaccommodative factors, was explored next to establish how much additional objective power would be required to restore the eye's focus with AIOLs. Blur tolerance was found to be the key contributor to ocular depth of focus, with an approximate dioptric influence of 0.60 D. Our understanding of MIOLs may be limited by the need for subjective defocus curves, which are lengthy and do not permit important additional measures to be undertaken. The use of aberrometry to provide faster objective defocus curves was examined. Although subjective and objective measures agreed well, the peaks of the MIOL defocus curve profile were not evident in the objective prediction of acuity, indicating a need for further refinement of visual quality metrics based on ocular aberrations. The experiments detailed in the thesis evaluate methods to improve visual performance with toric IOLs. They also investigate new techniques to allow more rapid post-operative assessment of premium IOLs, which could give greater insight into several aspects of visual quality, in order to optimise future IOL design and ultimately enhance patient satisfaction.
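The tilt-from-symmetry idea can be illustrated with the underlying right-triangle geometry: a tilted circular optic appears foreshortened along one meridian, and Pythagoras's theorem recovers the out-of-plane component. This is a simplified geometric sketch, not the thesis's actual image-processing pipeline, and the 6.0 mm optic diameter is an illustrative value.

```python
import math

# Illustrative geometry only (an assumption, not the thesis's exact method):
# a circular IOL optic of true diameter d, tilted by angle theta, projects
# onto the image plane with a foreshortened diameter d*cos(theta).
# Pythagoras's theorem gives the out-of-plane leg of the triangle, from
# which the tilt angle follows.

def tilt_from_projection(d_true_mm: float, d_apparent_mm: float) -> float:
    """Tilt angle in degrees from true and apparent (imaged) optic diameters."""
    out_of_plane = math.sqrt(d_true_mm**2 - d_apparent_mm**2)  # Pythagoras
    return math.degrees(math.atan2(out_of_plane, d_apparent_mm))

# A 6.0 mm optic whose image is foreshortened by cos(10 deg) should read
# back as roughly 10 degrees of tilt.
tilt = tilt_from_projection(6.0, 6.0 * math.cos(math.radians(10)))
```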


Coherent optical orthogonal frequency division multiplexing (CO-OFDM) has been actively considered as a potential candidate for long-haul transmission and 400 Gb/s to 1 Tb/s Ethernet transport because of its high spectral efficiency, efficient implementation, flexibility and robustness against linear impairments such as chromatic dispersion and polarization mode dispersion. However, due to the long symbol duration and narrow subcarrier spacing, CO-OFDM systems are sensitive to penalties induced by laser phase noise and fibre nonlinearity. As a result, the development of CO-OFDM transmission technology relies crucially on efficient techniques to compensate for the laser phase noise and fibre nonlinearity impairments. In this thesis, high-performance and low-complexity digital signal processing techniques for laser phase noise and fibre nonlinearity compensation in CO-OFDM transmission are demonstrated. For laser phase noise compensation, three novel techniques, namely quasi-pilot-aided, decision-directed-free blind and multiplier-free blind, are introduced. For fibre nonlinearity compensation, two novel techniques, referred to as phase-conjugated pilots and phase-conjugated subcarrier coding, are proposed. All these digital signal processing techniques offer high performance and flexibility while requiring relatively low complexity in comparison with other existing phase noise and nonlinearity compensation techniques. As a result of these developments, CO-OFDM technology is expected to play a significant role in future ultra-high-capacity optical networks. In addition, this thesis presents a preliminary study of nonlinear Fourier transform based transmission schemes, for which OFDM is a highly suitable modulation format. The results obtained pave the way towards a truly flexible nonlinear wave-division multiplexing system that allows current nonlinear transmission limits to be exceeded.
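The phase-conjugated pilots principle can be shown in miniature: a symbol and its phase-conjugated copy acquire approximately opposite nonlinear phase rotations, so combining the received pair cancels the rotation to first order. In the toy model below a single common phase shift stands in for the actual fibre nonlinearity; every value is an illustrative assumption, not a parameter from the thesis.

```python
import numpy as np

# Toy illustration of phase-conjugated-pilot cancellation. A common phase
# shift phi stands in for the fibre nonlinearity; real transmission adds
# noise and only approximately correlated distortions.

rng = np.random.default_rng(1)
s = np.exp(1j * np.pi / 4 * rng.choice([1, 3, 5, 7], size=16))  # QPSK data
phi = 0.2                                  # nonlinear phase shift (rad)

r1 = s * np.exp(1j * phi)                  # received data subcarriers
r2 = np.conj(s) * np.exp(1j * phi)         # received phase-conjugated copies

# r1 + conj(r2) = s * (e^{j phi} + e^{-j phi}) = 2 * s * cos(phi):
# the nonlinear phase cancels, leaving only a real scaling.
recovered = 0.5 * (r1 + np.conj(r2)) / np.cos(phi)
max_err = float(np.max(np.abs(recovered - s)))
```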


The development of a new set of frost property measurement techniques to be used in the control of frost growth and defrosting processes in refrigeration systems was investigated. Holographic interferometry and infrared thermometry were used to measure the temperature of the frost-air interface, while a beam element load sensor was used to obtain the weight of a deposited frost layer. The proposed measurement techniques were tested for the cases of natural and forced convection, and characteristic charts were obtained for a set of operating conditions. An improvement of existing frost growth mathematical models was also investigated. The early stage of frost nucleation was commonly not considered in these models; instead, initial values of layer thickness and porosity were usually assumed. A nucleation model was developed to obtain the droplet diameter and surface porosity at the end of the early frosting period. Drop-wise early condensation on a cold flat plate exposed, under natural convection, to warm (room-temperature) humid air was modeled. A nucleation rate was found, and the ratio of heat to mass transfer (Lewis number) was obtained. The Lewis number was found to be much smaller than unity, the standard value usually assumed in most frosting numerical models. The nucleation model was validated against available experimental data for the early nucleation and full growth stages of the frosting process. The combination of frost top temperature and weight variation signals can now be used to control defrosting timing, and the developed early nucleation model can now be used to simulate the entire process of frost growth on any surface material.
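The Lewis number mentioned above compares how fast heat diffuses relative to water vapor. The sketch below uses textbook-order property values for moist air near 0 °C (approximations of mine, not the study's data) to show that even the nominal value sits below the Le = 1 assumption common in frosting models.

```python
# Back-of-the-envelope Lewis number check. Property values are generic
# textbook approximations for moist air near 0 C, not the paper's data.

def lewis_number(alpha_m2_s: float, diff_m2_s: float) -> float:
    """Lewis number: thermal diffusivity over mass (vapor) diffusivity."""
    return alpha_m2_s / diff_m2_s

# alpha ~ 1.9e-5 m^2/s (air), water-vapor diffusivity ~ 2.2e-5 m^2/s
le = lewis_number(1.9e-5, 2.2e-5)   # already below the usual Le = 1 assumption
```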


The purpose of this thesis was to build the Guitar Application ToolKit (GATK), a series of applications used to expand the sonic capabilities of the acoustic/electric stereo guitar. A further goal of the GATK was to extend the improvisational capabilities and compositional techniques generated by this innovative instrument. During the GATK creation process, current guitar production techniques and the overall sonic result were enhanced by planning and implementing a personalized electro-acoustic performance setup, designing custom-made performance interfaces, creating interactive compositional strategies, crafting non-standardized sounds, and controlling various music parameters in real time using the Max/MSP programming environment. This was the first thesis project of its kind. It is expected that this thesis will be useful as a reference paper for electronic musicians and music technology students, as a product demonstration for companies that manufacture the relevant software, and as a personal portfolio for future technology-related jobs.


This dissertation research project addressed the question of how hydrologic restoration of the Everglades is affecting the nutrient dynamics of marsh ecosystems in the southern Everglades. These effects were analyzed by quantifying nitrogen (N) cycle dynamics in the region. I utilized stable isotope tracer techniques to investigate nitrogen uptake and cycling between the major ecosystem components of the freshwater marsh system. I recorded the natural isotopic signatures (δ15N and δ13C) of major ecosystem components from the three major watersheds of the Everglades: Shark River Slough, Taylor Slough, and the C-111 basin. Analysis of the δ15N and δ13C natural abundance data was used to demonstrate the spatial extent to which nitrogen from anthropogenic or naturally enriched sources is entering the marshes of the Everglades. In addition, I measured the fluxes of N between various ecosystem components at both near-canal and estuarine ecotone locations. Lastly, I investigated the effect of three phosphorus load treatments (0.00 mg P m-2, 6.66 mg P m-2, and 66.6 mg P m-2) on the rate and magnitude of ecosystem N-uptake and N-cycling. The δ15N and δ13C natural abundance data supported the hypothesis that ecosystem components from near-canal sites have heavier, more enriched δ15N isotopic signatures than downstream sites. The natural abundance data also showed that the marshes of the southern Everglades act as a sink for isotopically heavier, canal-borne dissolved inorganic nitrogen (DIN) and as a source of "new" marsh-derived dissolved organic nitrogen (DON). In addition, the 15N mesocosm data showed rapid assimilation of the 15N tracer by the periphyton component and delayed N uptake by the soil and macrophyte components in the southern Everglades.
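The δ15N values discussed above use standard delta notation: a sample's 15N/14N ratio expressed in per mil relative to atmospheric N2. The sketch below applies that general formula; the example ratio is a made-up illustrative number, not data from this study.

```python
# Standard delta notation for stable nitrogen isotopes (a general formula,
# not data from this study): delta15N in per mil relative to atmospheric N2.

R_AIR_15N = 0.0036765  # 15N/14N ratio of atmospheric N2 (the delta15N standard)

def delta15N(r_sample: float, r_standard: float = R_AIR_15N) -> float:
    """delta15N (per mil) = (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A hypothetical sample enriched to R = 0.0036949 reads about +5 per mil,
# i.e. isotopically heavier, as described for canal-influenced DIN.
d = delta15N(0.0036949)
```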


The nation's freeway systems are becoming increasingly congested, and a major contributor to freeway congestion is traffic incidents. Traffic incidents are non-recurring events, such as accidents or stranded vehicles, that cause a temporary reduction in roadway capacity; they can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic away from incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs or real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict; in addition, the duration of an incident is affected by many contributing factors. Determining and understanding these factors can help in identifying and developing better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, with limited success. This dissertation research attempts to improve on these previous efforts by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid decision making about traffic diversion during an ongoing incident. Multiple data mining techniques were applied and evaluated in the research. Multiple linear regression analysis and a decision tree based method were applied to develop the offline models, while a rule-based method and a tree algorithm called M5P were used to develop the online models. The results show that the models can in general achieve high prediction accuracy, within acceptable intervals of the actual durations. The research also identifies some new contributing factors that have not been examined in past studies. As part of the research effort, software code was developed to implement the models in the existing software system of FDOT District 4 for actual applications.
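The "offline" multiple linear regression idea can be sketched in a few lines: regress incident duration on candidate contributing factors and read off their estimated effects. The features, coefficients and data below are synthetic placeholders of mine, not the FDOT District 4 database or the dissertation's fitted model.

```python
import numpy as np

# Minimal multiple-linear-regression sketch for incident duration
# prediction. All features and the ground-truth relation are synthetic
# illustrations, not FDOT data.

rng = np.random.default_rng(2)
n = 200
lanes_blocked = rng.integers(1, 4, n).astype(float)
heavy_vehicle = rng.integers(0, 2, n).astype(float)   # 1 if a truck is involved
peak_hour = rng.integers(0, 2, n).astype(float)       # 1 if during peak hour

# Assumed ground-truth linear relation for the synthetic data (minutes).
duration_min = 20 + 15 * lanes_blocked + 25 * heavy_vehicle + 10 * peak_hour

X = np.column_stack([np.ones(n), lanes_blocked, heavy_vehicle, peak_hour])
coef, *_ = np.linalg.lstsq(X, duration_min, rcond=None)  # least-squares fit

pred = X @ coef
max_err = float(np.max(np.abs(pred - duration_min)))
```

Because the synthetic data are exactly linear, the fit recovers the coefficients; on real incident data the residuals quantify prediction accuracy.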


This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user groups for this system are individuals who have moderate to severe visual aberrations for which conventional means of compensation, such as glasses or contact lenses, do not improve their vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to this user, will counteract his or her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means to eliminate some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method. To provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, with a single-lens high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. Experiments were conducted on these systems and the collected data were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method produced a statistically significant improvement in vision on all of the systems. Although significant, the improvement was not as large as expected in the human subject tests. Further analysis suggests that even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing. This would require real-time monitoring of relevant variables (e.g. pupil diameter) and continuous adjustment of the pre-compensation process to yield maximum viewing enhancement.
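The core pre-compensation idea can be sketched as regularized inverse filtering: if the eye's blur is known as an optical transfer function (OTF), pre-filter the displayed image so that the blurred result approximates the original. This is a generic simplification of mine, not the dissertation's algorithm; the Gaussian OTF below is a toy stand-in for a wavefront-measured aberration.

```python
import numpy as np

# Conceptual pre-compensation sketch: blur is modeled as multiplication by
# an OTF in the frequency domain, and pre-compensation applies a
# Wiener-style regularized inverse of that OTF. Toy OTF, not measured data.

n = 32
rng = np.random.default_rng(3)
img = rng.random((n, n))                    # the image we want perceived

fx = np.fft.fftfreq(n)[:, None]
fy = np.fft.fftfreq(n)[None, :]
otf = np.exp(-2.0 * (fx**2 + fy**2))        # toy low-pass "eye blur"

k = 1e-9                                    # regularization, avoids division by ~0
img_f = np.fft.fft2(img)
pre_f = img_f * np.conj(otf) / (np.abs(otf)**2 + k)   # pre-compensated spectrum

perceived = np.real(np.fft.ifft2(pre_f * otf))        # what the blurred eye sees
max_err = float(np.max(np.abs(perceived - img)))
```

The regularization constant k trades perfect inversion against the dynamic-range penalties that make display-based pre-compensation hard in practice.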


Distributed applications are exposed as reusable components that are dynamically discovered and integrated to create new applications. These new applications, in the form of aggregate services, are vulnerable to failure due to the autonomous and distributed nature of their integrated components. This vulnerability creates the need for adaptability in aggregate services. The need for adaptation is accentuated for complex long-running applications, as found in scientific Grid computing, where distributed computing nodes may participate to solve computation- and data-intensive problems. Such applications integrate services for coordinated problem solving in areas such as bioinformatics. For such applications, when a constituent service fails, the application fails, even though other nodes could substitute for the failed service. This concern is not addressed in the specification of high-level composition languages such as the Business Process Execution Language (BPEL). We propose an approach to transparently autonomizing existing BPEL processes in order to make them modifiable at runtime and more resilient to failures in their execution environment. Because the adaptive behavior is introduced transparently, adaptation preserves the original business logic of the aggregate service and does not tangle the code for adaptive behavior with that of the aggregate service. The major contributions of this dissertation are: first, we assessed the effectiveness of BPEL language support in developing adaptive mechanisms; as a result, we identified the strengths and limitations of BPEL and devised strategies to address those limitations. Second, we developed a technique to enhance existing BPEL processes transparently in order to support dynamic adaptation, proposing a framework that uses transparent shaping and generative programming to make BPEL processes adaptive. Third, we developed a technique to dynamically discover and bind to substitute services. Our technique was evaluated, and the results showed that dynamic utilization of components improves the flexibility of adaptive BPEL processes. Fourth, we developed an extensible policy-based technique to specify how to handle exceptional behavior, and a generic component that introduces adaptive behavior for multiple BPEL processes. Fifth, we identified ways to apply our work to facilitate adaptability in composite Grid services.
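The substitution behavior that the adaptive processes add can be paraphrased in a few lines of ordinary code. The Python below is a language-neutral stand-in for the BPEL fault-handler-plus-rebind logic, not generated BPEL itself, and the BLAST-like service names are hypothetical.

```python
# Sketch of fail-over to dynamically discovered substitute services, the
# behavior transparently woven into adaptive BPEL processes. Service names
# are hypothetical illustrations.

def invoke_with_substitution(primary, substitutes):
    """Try the primary service, then each discovered substitute in turn."""
    for service in [primary, *substitutes]:
        try:
            return service()
        except RuntimeError:
            continue                      # emulates a BPEL fault handler -> rebind
    raise RuntimeError("no substitute service available")

def failing_blast():                      # the primary BLAST-like service is down
    raise RuntimeError("endpoint unreachable")

def mirror_blast():                       # a dynamically discovered substitute
    return "alignment-result"

result = invoke_with_substitution(failing_blast, [mirror_blast])
```

The aggregate service's business logic never sees the failure; it only sees a successful invocation, which is the "transparent" part of transparent shaping.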


The Internet has become an integral part of our nation's critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crime. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence. It provides a platform for performing deep network analysis by capturing, recording and analyzing network events to find the source of a security attack or other information security incidents. Existing network forensics work has focused mostly on the Internet and fixed networks, but the exponential growth and use of wireless technologies, coupled with their unprecedented characteristics, necessitate the development of new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad-hoc network forensics. It was one of the first works to identify this problem and offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and to report logged incidents. For recording incidents, location is considered essential to documenting network incidents; however, in network topology spaces, location cannot be measured due to the absence of a distance metric. A novel solution was therefore proposed to label the locations of nodes within network topology spaces, and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHTs) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollably recursive traffic, a new mechanism was introduced that overcomes this recursion. These logging and reporting techniques aided forensics over cellular and ad-hoc networks, which in turn increased the ability to track and trace attacks to their source. These techniques are a starting point for further research and development that would equip future ad hoc networks with forensic components to complement existing security mechanisms.
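A DHT places each logged report at a node determined by hashing, so any forensic investigator can later locate it without flooding the network. The sketch below shows generic consistent-hashing placement on a ring; the node names, ring size and key are illustrative assumptions, not the dissertation's mechanism.

```python
import hashlib

# Generic DHT-style placement of incident reports (an illustration of the
# underlying idea, not the dissertation's reporting mechanism): hash keys
# and nodes onto a ring; a report is stored at the first node whose ring
# position follows the key's position.

RING = 2**16

def pos(name: str) -> int:
    """Deterministic ring position for a node name or report key."""
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % RING

def responsible_node(report_key: str, nodes: list[str]) -> str:
    key = pos(report_key)
    ring = sorted(nodes, key=pos)
    for node in ring:
        if pos(node) >= key:
            return node
    return ring[0]                      # wrap around the ring

nodes = ["node-a", "node-b", "node-c"]
home = responsible_node("incident-42", nodes)
```

Because placement depends only on hashes, every participant computes the same home node for a given report key.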


A comprehensive investigation of sensitive ecosystems in South Florida is reported, with the main goal of determining the identity, spatial distribution, and sources of both organic biocides and trace elements in different environmental compartments. This study presents the development and validation of a method for the fractionation and isolation from surface water of twelve polar acidic herbicides commonly applied in the vicinity of the study areas, including 2,4-D, MCPA, dichlorprop, mecoprop and picloram. Solid phase extraction (SPE) was used to isolate the analytes from abiotic matrices containing large amounts of dissolved organic material. Atmospheric-pressure ionization with electrospray ionization in negative mode (ESI-) in a quadrupole ion trap mass spectrometer was used to characterize the herbicides of interest. The application of Laser Ablation ICP-MS (LA-ICP-MS) methodology to the analysis of soils and sediments is also reported. The analytical performance of the method was evaluated on certified standards and real soil and sediment samples. Residential soils were analyzed to evaluate the feasibility of using this powerful technique as a routine, rapid method to monitor potentially contaminated sites. Forty-eight sediments were also collected from semi-pristine areas in South Florida for a screening of baseline levels of bioavailable elements in support of risk evaluation. The LA-ICP-MS data were used to perform a statistical evaluation of elemental composition as a tool for environmental forensics. A LA-ICP-MS protocol was also developed and optimized for the elemental analysis of a wide range of elements in polymeric filters containing atmospheric dust. A quantitative strategy based on internal and external standards allowed rapid determination of airborne trace elements in filters containing both contemporary African dust and local dust emissions. These distributions were used to qualitatively and quantitatively assess differences in composition and to establish provenance and fluxes to protected regional ecosystems such as coral reefs and national parks.
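Internal-standard quantification, as used in such LA-ICP-MS strategies, follows a simple textbook relation: the analyte concentration comes from its signal ratio to an internal standard of known concentration, scaled by a sensitivity factor from external calibration. The function and all numbers below are illustrative, not the study's calibration data.

```python
# Generic internal-standard quantification relation for LA-ICP-MS
# (textbook form, not this study's calibration).

def quantify(i_analyte: float, i_istd: float,
             c_istd: float, rel_sensitivity: float) -> float:
    """C_analyte = (I_analyte / I_istd) * C_istd / relative sensitivity."""
    return (i_analyte / i_istd) * c_istd / rel_sensitivity

# Illustrative numbers: analyte signal 5000 counts vs 20000 counts for an
# internal standard at 100 ug/g, relative sensitivity 0.5 -> 50 ug/g.
c_analyte = quantify(5000, 20000, 100.0, 0.5)
```

The internal standard corrects for shot-to-shot variation in ablated mass, which is why it is essential for laser ablation work on heterogeneous filters and sediments.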


Recently, wireless network technology has grown at such a pace that scientific research has become practical reality in a very short time span. Mobile wireless communications have witnessed the adoption of several generations, each complementing and improving on the former. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are working on possible solutions for the 4G system. 4G is a collection of technologies and standards that will allow a range of ubiquitous computing and wireless communication architectures. The researcher considers the ability to guarantee reliable communications from 100 Mbps, for high-mobility links, to as high as 1 Gbps for low-mobility users, together with high spectrum efficiency, to be among the most important characteristics of future 4G mobile systems. In mobile wireless communications networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable, and so must rely upon satellite coverage. A good modulation and access technique is also required to transmit high data rates over satellite links to mobile users. This technique must adapt to the characteristics of the satellite channel and use the allocated bandwidth efficiently. Satellite links are fading channels when used by mobile users. Measures designed to cope with these fading environments include: (1) spatial diversity (a two-receive-antenna configuration); (2) time diversity (channel interleaving/spreading techniques); and (3) upper-layer FEC. The author proposes the use of OFDM (Orthogonal Frequency Division Multiplexing) on the satellite link, increasing the time diversity. This technique will allow an increase of the data rate, as required primarily by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation considers the use of cooperative satellite communications for hybrid satellite/terrestrial networks. With this technique, satellite coverage can be extended to areas where there is no direct link to the satellite; for this purpose, a good channel model is necessary.
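A bare-bones OFDM transmit/receive chain makes the proposal concrete: map symbols to subcarriers with an IFFT, prepend a cyclic prefix as a guard interval, and invert the steps at the receiver. This is a generic textbook sketch over an ideal channel, not the dissertation's satellite air interface; the subcarrier and prefix lengths are illustrative.

```python
import numpy as np

# Minimal OFDM modulator/demodulator over an ideal channel. Parameter
# values are illustrative, not from the dissertation.

rng = np.random.default_rng(4)
n_sc, cp = 64, 16                               # subcarriers, cyclic prefix length
sym = ((2 * rng.integers(0, 2, n_sc) - 1)
       + 1j * (2 * rng.integers(0, 2, n_sc) - 1))  # QPSK-like symbols

tx_time = np.fft.ifft(sym)                      # subcarriers -> time domain
tx = np.concatenate([tx_time[-cp:], tx_time])   # cyclic prefix guards against delay spread

rx = tx[cp:]                                    # ideal channel: just strip the CP
rx_sym = np.fft.fft(rx)                         # back to subcarriers

max_err = float(np.max(np.abs(rx_sym - sym)))
```

On a fading satellite channel, the cyclic prefix plus per-subcarrier equalization is what makes OFDM robust to the delay spread the text describes.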


With the explosive growth of the volume and complexity of document data (e.g., news, blogs, web pages), it has become a necessity to semantically understand documents and deliver meaningful information to users. The areas dealing with these problems cut across data mining, information retrieval, and machine learning. For example, document clustering and summarization are two fundamental techniques for understanding document data and have attracted much attention in recent years. Given a collection of documents, document clustering aims to partition them into different groups to provide efficient document browsing and navigation mechanisms. One open problem in document clustering is how to generate a meaningful interpretation for each document cluster resulting from the clustering process. Document summarization is another effective technique for document understanding, which generates a summary by selecting sentences that deliver the major or topic-relevant information in the original documents. How to improve automatic summarization performance and how to apply it to newly emerging problems are two valuable research directions. To help people capture the semantics of documents effectively and efficiently, this dissertation focuses on developing effective data mining and machine learning algorithms and systems for (1) integrating document clustering and summarization to obtain meaningful document clusters with summarized interpretations, (2) improving document summarization performance and building document understanding systems to solve real-world applications, and (3) summarizing the differences and evolution of multiple document sources.
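Extractive summarization of the kind described, selecting sentences that carry the major information, can be sketched with a simple frequency-based scorer: rate each sentence by how common its words are in the whole document and keep the top scorers. This is a generic baseline of mine, not the dissertation's algorithms.

```python
import re
from collections import Counter

# Minimal frequency-based extractive summarizer (a generic baseline, not
# the dissertation's methods): sentences whose words are frequent across
# the document are assumed to carry its major information.

def summarize(text: str, n_sentences: int = 1) -> list[str]:
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence: str) -> float:
        toks = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in toks) / max(len(toks), 1)

    return sorted(sentences, key=score, reverse=True)[:n_sentences]

doc = ("Document clustering groups similar documents. "
       "Document summarization selects the sentences that carry the "
       "main document information. "
       "The weather was pleasant.")
top = summarize(doc)    # the off-topic weather sentence scores lowest
```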


Structural Health Monitoring (SHM) systems are developed to evaluate the integrity of a system during operation and to quickly identify maintenance problems. They will be used in future aerospace vehicles to improve safety, reduce cost and minimize system maintenance time. Many SHM systems have already been developed to evaluate the integrity of plates and are used in marine structures; their implementation in manufacturing processes is still awaited. The application of SHM methods to complex geometries and to welds are two important challenges in this area of research. This research work started by studying the characteristics of piezoelectric actuators, and a small energy harvester was designed. The output voltages at different vibration frequencies were acquired to determine the nonlinear characteristics of the piezoelectric stripe actuators, and the frequency response was evaluated experimentally. AA-battery-sized energy harvesting devices were developed using these actuators: when the round and square cross-section devices were excited at 50 Hz, they generated 16 V and 25 V respectively. The Surface Response to Excitation (SuRE) and Lamb wave methods were used to estimate the condition of parts with complex geometries, considering cutting tools and welded plates. Both approaches used piezoelectric elements attached to the surfaces of the parts under consideration. With the SuRE method, the variation of the magnitude of the frequency response was evaluated and the sum of the squared differences was calculated. The envelope of the received signal was used for the analysis of wave propagation, and bi-orthogonal wavelet (binlet) analysis was also used to evaluate the data obtained with the Lamb wave technique. Both the Lamb wave and SuRE approaches, along with the three data analysis methods, worked effectively to detect increasing tool wear. Similarly, they detected defects on the plate, on the weld, and on a separate plate without any sensor, as long as it was welded to the test plate.
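The sum-of-squared-differences metric used with the SuRE method compares the monitored part's frequency response magnitude against a healthy baseline; the score stays near zero while the part is intact and grows as wear or damage shifts the response. The synthetic spectra below are my illustration of that comparison, not the study's measurements.

```python
import numpy as np

# Sketch of the SuRE-style damage metric (my reading of the described
# procedure, with synthetic spectra): sum of squared differences between
# a healthy baseline spectrum and the current one.

def sure_ssd(baseline_mag, current_mag) -> float:
    """Sum of squared differences between baseline and current spectra."""
    b = np.asarray(baseline_mag, dtype=float)
    c = np.asarray(current_mag, dtype=float)
    return float(np.sum((c - b) ** 2))

freqs = np.linspace(0.0, 1.0, 100)                    # normalized frequency axis
baseline = np.exp(-((freqs - 0.50) ** 2) / 0.01)      # healthy response peak
worn = np.exp(-((freqs - 0.55) ** 2) / 0.01)          # wear shifts the peak

healthy_score = sure_ssd(baseline, baseline)          # 0.0: no change detected
worn_score = sure_ssd(baseline, worn)                 # grows with increasing wear
```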