875 results for Voltage disturbance detection and classification
Abstract:
How are the image statistics of global image contrast computed? We answered this by using a contrast-matching task for checkerboard configurations of ‘battenberg’ micro-patterns where the contrasts and spatial spreads of interdigitated pairs of micro-patterns were adjusted independently. Test stimuli were 20 × 20 arrays with various sized cluster widths, matched to standard patterns of uniform contrast. When one of the test patterns contained a pattern with much higher contrast than the other, that determined global pattern contrast, as in a max() operation. Crucially, however, the full matching functions had a curious intermediate region where low contrast additions for one pattern to intermediate contrasts of the other caused a paradoxical reduction in perceived global contrast. None of the following models predicted this: RMS, energy, linear sum, max, Legge and Foley. However, a gain control model incorporating wide-field integration and suppression of nonlinear contrast responses predicted the results with no free parameters. This model was derived from experiments on summation of contrast at threshold, and masking and summation effects in dipper functions. Those experiments were also inconsistent with the failed models above. Thus, we conclude that our contrast gain control model (Meese & Summers, 2007) describes a fundamental operation in human contrast vision.
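The rejected pooling rules named above (max, RMS, energy, linear sum) are easy to state explicitly. The following is a minimal illustrative sketch, not the paper's stimuli or fitting procedure; the contrast values and equal area weighting are assumptions. It shows why none of these rules can predict the paradoxical result: adding contrast leaves the max unchanged and can only increase the RMS and linear-sum outputs, so none of them allows perceived global contrast to *decrease*.

```python
import math

def pool_max(c1, c2):
    # max() rule: global contrast equals the higher micro-pattern contrast.
    return max(c1, c2)

def pool_rms(c1, c2, w1=0.5, w2=0.5):
    # RMS rule: square root of the (area-weighted) mean of squared contrasts.
    return math.sqrt(w1 * c1**2 + w2 * c2**2)

def pool_linear(c1, c2, w1=0.5, w2=0.5):
    # Linear-sum rule: area-weighted sum of the two contrasts.
    return w1 * c1 + w2 * c2

# A low-contrast addition (c2 = 0.05) to an intermediate-contrast pattern (c1 = 0.20):
c1, c2 = 0.20, 0.05
print(pool_max(c1, c2))     # 0.2 -- unchanged by the addition
print(pool_rms(c1, c2))     # ~0.1458 -- higher than with c2 = 0
print(pool_linear(c1, c2))  # 0.125 -- higher than with c2 = 0
```

Because each rule is non-decreasing in both arguments, the observed reduction in matched global contrast at intermediate levels falsifies all of them, which is the leverage the gain-control model exploits.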
Abstract:
Holistic face perception, i.e. the mandatory integration of featural information across the face, has been considered to play a key role when recognizing emotional face expressions (e.g., Tanaka et al., 2002). However, despite their early onset, holistic processing skills continue to improve throughout adolescence (e.g., Schwarzer et al., 2010) and therefore might modulate the evaluation of facial expressions. We tested this hypothesis using an attentional blink (AB) paradigm to compare the impact of happy, fearful and neutral faces in adolescents (10–13 years) and adults on subsequently presented neutral target stimuli (animals, plants and objects) in a rapid serial visual presentation stream. Adolescents and adults were found to be equally reliable when reporting the emotional expression of the face stimuli. However, the detection of emotional but not neutral faces imposed a significantly stronger AB effect on the detection of the neutral targets in adults compared to adolescents. In a control experiment we confirmed that adolescents rated emotional faces lower in terms of valence and arousal than adults. The results suggest a protracted development of the ability to evaluate facial expressions that might be attributed to the late maturation of holistic processing skills.
Abstract:
Research has found that, as a result of their particularities, different countries have established partly different accounting frameworks. Studies with inductive approaches typically encompass a wide range of regulatory issues, but are based on a limited number of factors only. In the case of Statements of Cash Flows, most studies have so far only examined the existence of rules governing the presentation of the statement, without an in-depth analysis of the details. Therefore, these studies only found relatively minor differences in this field. The author's research shows that many differences exist in the details of national Cash Flow Statement regulations, which makes it possible to classify the countries into groups using the method of hierarchical clustering.
Abstract:
Raster graphic ampelometric software was developed not exclusively for the estimation of leaf area, but also for the characterization of grapevine (Vitis vinifera L.) leaves. The software was written in the C++ programming language, using C++ Builder 2007, for Windows 95–XP and Linux operating systems. It handles desktop-scanned images. On the image analysed with the GRA.LE.D., the user has to determine 11 points. These points are then connected and the distances between them calculated. The GRA.LE.D. software supports standard ampelometric measurements such as leaf area, angles between the veins and lengths of the veins. These measurements are recorded by the software and exported into plain ASCII text files for single or multiple samples. Twenty-two biometric data points of each leaf are identified by the GRA.LE.D. It presents the opportunity to statistically analyse experimental data, allows comparison of cultivars and enables graphic reconstruction of leaves using the Microsoft Excel Chart Wizard. The GRA.LE.D. was thoroughly calibrated and compared to other widely used instruments and methods such as photo-gravimetry, LiCor L0100, WinDIAS2.0 and ImageTool. By comparison, the GRA.LE.D. presented the most accurate measurements of leaf area, but the LiCor L0100 and the WinDIAS2.0 were faster, while the photo-gravimetric method proved to be the most time-consuming. The WinDIAS2.0 instrument was the least reliable. The GRA.LE.D. is uncomplicated, user-friendly, accurate, consistent, reliable and has wide practical application.
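As a rough illustration of how an area can be computed from user-determined boundary points connected into a polygon, here is a shoelace-formula sketch in Python. This is an assumption for illustration only; the abstract does not describe the actual area algorithm used by GRA.LE.D.

```python
def polygon_area(points):
    """Shoelace formula: area of a simple polygon from its ordered (x, y) vertices."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the polygon
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# A 2 x 3 rectangle traced counter-clockwise: area is 6 square units.
print(polygon_area([(0, 0), (2, 0), (2, 3), (0, 3)]))  # 6.0
```

For a desktop-scanned image the result is in square pixels; dividing by (DPI / 2.54)² would convert it to cm².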
Abstract:
Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a 3-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometrical objects, such as high-resolution digital terrain models (DTMs), buildings and trees. In the past decade, LIDAR has attracted growing interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, the voluminous data pose a new challenge for automated extraction of geometrical information from LIDAR measurements, because many raster image processing techniques cannot be applied directly to irregularly spaced LIDAR points.

In this dissertation, a framework is proposed to automatically extract different kinds of geometrical objects, such as terrain and buildings, from LIDAR data. These products are essential to numerous applications such as flood modeling, landslide prediction and hurricane animation. The framework consists of several intuitive algorithms. First, a progressive morphological filter was developed to detect non-ground LIDAR measurements: by gradually increasing the window size and elevation-difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from non-ground measurements using a region-growing algorithm based on a plane-fitting technique. Raw footprints for segmented building measurements are derived by connecting boundary points and are further simplified and adjusted by several proposed operations to remove the noise caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then further adjusted. Since the adjusting operations for simple building models do not work well on complex 2D topology, a 2D snake algorithm is proposed to adjust the topology. The 2D snake algorithm consists of newly defined energy functions for topology adjustment and a linear algorithm to find the minimal energy value of 2D snake problems. Data sets from urbanized areas including large institutional, commercial, and small residential buildings were employed to test the proposed framework. The results demonstrate that the proposed framework achieves very good performance.
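The progressive morphological filter described above can be sketched in one dimension as repeated grayscale opening (erosion followed by dilation) with a growing window: points rising above the opened surface by more than the current elevation-difference threshold are flagged as non-ground. This is a minimal sketch of the idea; the window sizes, thresholds, and toy elevation profile below are illustrative assumptions, not the dissertation's parameters.

```python
def erode(z, w):
    # Grayscale erosion: minimum within a sliding window of width w.
    h = w // 2
    return [min(z[max(0, i - h):i + h + 1]) for i in range(len(z))]

def dilate(z, w):
    # Grayscale dilation: maximum within a sliding window of width w.
    h = w // 2
    return [max(z[max(0, i - h):i + h + 1]) for i in range(len(z))]

def opening(z, w):
    # Morphological opening removes bumps narrower than the window.
    return dilate(erode(z, w), w)

def progressive_filter(z, windows=(3, 5, 9), thresholds=(0.5, 1.0, 2.0)):
    """Flag non-ground points: at each scale, points whose elevation exceeds the
    opened surface by more than the threshold are marked non-ground."""
    surface = list(z)
    nonground = [False] * len(z)
    for w, t in zip(windows, thresholds):
        opened = opening(surface, w)
        for i in range(len(z)):
            if surface[i] - opened[i] > t:
                nonground[i] = True
        surface = opened  # next pass works on the progressively smoothed surface
    return nonground

# Flat 10 m terrain with a 4 m-high "building" spanning indices 4-6:
z = [10.0] * 10
for i in (4, 5, 6):
    z[i] = 14.0
print(progressive_filter(z))  # only indices 4-6 flagged as non-ground
```

The growing window is what lets the filter remove small objects (vehicles) at small scales and large objects (buildings) at large scales while preserving gradual terrain slopes.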
Abstract:
Refuge habitats increase survival rate and recovery time of populations experiencing environmental disturbance, but limits on the ability of refuges to buffer communities are poorly understood. We hypothesized that the importance of refuges in preventing population declines and alteration of community structure has a non-linear relationship with the severity of disturbance. In the Florida Everglades, alligator ponds are used as refuge habitat by fishes during seasonal drying of marsh habitats. Using an 11-year record of hydrological conditions and fish abundance in 10 marshes and 34 alligator ponds from two regions of the Everglades, we sought to characterize patterns of refuge use and the temporal dynamics of fish abundance and community structure across changing intensity, duration, and frequency of drought disturbance. Abundance in alligator ponds was positively related to refuge size, distance from alternative refugia (e.g. canals), and abundance in the surrounding marsh prior to hydrologic disturbance. Variables negatively related to abundance in alligator ponds included water level in the surrounding marsh and abundance of disturbance-tolerant species. Refuge community structure did not differ between regions because the same subset of species in both regions used alligator ponds during droughts. When the time between disturbances was short, fish abundance declined in marshes, and in the region with the most spatially extensive pattern of disturbance, community structure was altered in both marshes and alligator ponds because of an increased proportion of species more resistant to disturbance. These changes in community structure were associated with increases in both the duration and frequency of hydrologic disturbance. Use of refuge habitat had a modal relationship with the severity of the disturbance regime. Spatial patterns of response suggest that the decline in refuge use was due to decreased effectiveness of refuge habitat in reducing mortality and in providing sufficient recovery time for fish communities experiencing reduced time between disturbance events.
Abstract:
Models of community regulation commonly incorporate gradients of disturbance inversely related to the role of biotic interactions in regulating intermediate trophic levels. Higher trophic-level organisms are predicted to be more strongly limited by intermediate levels of disturbance than are the organisms they consume. We used a manipulation of the frequency of hydrological disturbance in an intervention analysis to examine its effects on small-fish communities in the Everglades, USA. From 1978 to 2002, we monitored fishes at one long-hydroperiod (average 350 days) and at one short-hydroperiod (average 259 days; monitoring started here in 1985) site. At a third site, managers intervened in 1985 to diminish the frequency and duration of marsh drying. By the late 1990s, the successional dynamics of density and relative abundance at the intervention site converged on those of the long-hydroperiod site. Community change was manifested over 3 to 5 years following a dry-down if a site remained inundated; the number of days since the most recent drying event and length of the preceding dry period were useful for predicting population dynamics. Community dissimilarity was positively correlated with the time since last dry. Community dynamics resulted from change in the relative abundance of three groups of species linked by life-history responses to drought. Drought frequency and intensity covaried in response to hydrological manipulation at the landscape scale; community-level successional dynamics converged on a relatively small range of species compositions when drought return-time extended beyond 4 years. The density of small fishes increased with diminution of drought frequency, consistent with disturbance-limited community structure; less-frequent drying than experienced in this study (i.e., longer return times) yields predator-dominated regulation of small-fish communities in some parts of the Everglades.
Abstract:
Airborne LIDAR (Light Detection and Ranging) is a relatively new technique that rapidly and accurately measures micro-topographic features. This study compares topography derived from LIDAR with subsurface karst structures mapped in 3 dimensions with ground-penetrating radar (GPR). Over 500 km of LIDAR data were collected in 1995 by the NASA ATM instrument. The LIDAR data were processed and analyzed to identify closed depressions. A GPR survey was then conducted at a 200 by 600 m site to determine whether the target features are associated with buried karst structures. The GPR survey resolved two major depressions in the top of a clay-rich layer at ~10 m depth. These features are interpreted as buried dolines and are associated spatially with subtle (<1 m) trough-like depressions in the topography resolved from the LIDAR data. This suggests that airborne LIDAR may be a useful tool for indirectly detecting subsurface features associated with sinkhole hazards.
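One simple way to flag candidate closed depressions in a gridded DEM is to find cells that are strictly lower than all eight neighbours. This is a minimal illustrative sketch, not the study's actual processing chain; production workflows typically use more robust sink-filling (e.g. priority-flood) methods, and the grid values below are invented.

```python
def local_depressions(dem):
    """Return (row, col) cells strictly lower than all 8 neighbours --
    seed points of closed depressions in a gridded DEM (border cells skipped)."""
    rows, cols = len(dem), len(dem[0])
    sinks = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            centre = dem[r][c]
            nbrs = [dem[r + dr][c + dc]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0)]
            if all(centre < n for n in nbrs):
                sinks.append((r, c))
    return sinks

# 5x5 surface with a subtle 0.8 m closed depression at (2, 2):
dem = [[10.0] * 5 for _ in range(5)]
dem[2][2] = 9.2
print(local_depressions(dem))  # [(2, 2)]
```

Sub-metre features like the trough-like depressions in this study would require the DEM's vertical precision to exceed the depression depth, which is what makes LIDAR-derived topography attractive here.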
Abstract:
The presence of inhibitory substances in biological forensic samples has affected, and continues to affect, the quality of the data generated by DNA typing processes. Although the chemistries used during these procedures have been enhanced to mitigate the effects of deleterious compounds, some challenges remain. Inhibitors can be components of the samples themselves, of the substrate where samples were deposited, or chemicals associated with the DNA purification step. Therefore, a thorough understanding of the extraction processes and their ability to handle the various types of inhibitory substances can help define the best analytical processing for any given sample. A series of experiments was conducted to establish the inhibition tolerance of quantification and amplification kits using common inhibitory substances, in order to determine whether current laboratory practices are optimal for identifying potential problems associated with inhibition. DART mass spectrometry was used to determine the amount of inhibitor carryover after sample purification, its correlation with the initial inhibitor input in the sample, and its overall effect on the results. Finally, a novel alternative for gathering investigative leads from samples that would otherwise be ineffective for DNA typing, due to large amounts of inhibitory substances and/or environmental degradation, was tested. This included generating data associated with microbial peak signatures to identify the locations of clandestine human graves. Results demonstrate that the current methods for assessing inhibition are not necessarily accurate, as samples that appear inhibited in the quantification process can yield full DNA profiles, while those that do not indicate inhibition may suffer from lowered amplification efficiency or PCR artifacts. The extraction methods tested were able to remove >90% of the inhibitors from all samples, with the exception of phenol, which was present in variable amounts whenever the organic extraction approach was used. Although the results suggest that most inhibitors have minimal effects on downstream applications, analysts should exercise caution when selecting the best extraction method for particular samples, as casework DNA samples are often present in small quantities and can contain an overwhelming amount of inhibitory substances.
Abstract:
In the last decade, large numbers of social media services have emerged and been widely adopted in people's daily life as important information sharing and acquisition tools. With a substantial amount of user-contributed text data on social media, it becomes a necessity to develop methods and tools for analyzing this emerging type of data, in order to better utilize it to deliver meaningful information to users. Previous work on text analytics in the last several decades has mainly focused on traditional types of text such as emails, news and academic literature, and several issues critical to text data on social media have not been well explored: 1) how to detect sentiment in text on social media; 2) how to make use of social media's real-time nature; 3) how to address information overload for flexible information needs. In this dissertation, we focus on these three problems. First, to detect the sentiment of text on social media, we propose a non-negative matrix tri-factorization (tri-NMF) based dual active supervision method to minimize human labeling effort for this new type of data. Second, to make use of social media's real-time nature, we propose approaches to detect events from text streams on social media. Third, to address information overload for flexible information needs, we propose two summarization frameworks: a dominating-set based summarization framework and a learning-to-rank based summarization framework. The dominating-set based framework can be applied to different types of summarization problems, while the learning-to-rank based framework utilizes existing training data to guide new summarization tasks. In addition, we integrate these techniques in an application study of event summarization for sports games as an example of how to better utilize social media data.
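The dominating-set idea behind the first summarization framework can be illustrated generically: build a sentence-similarity graph and greedily pick sentences until every sentence is either picked or adjacent to a picked one. This sketch is an assumption for illustration, not the dissertation's actual framework; the word-overlap adjacency criterion and threshold are invented.

```python
def dominating_set_summary(sentences, threshold=1):
    """Greedy dominating set over a sentence-similarity graph: two sentences are
    adjacent if they share more than `threshold` words; repeatedly pick the
    sentence that covers the most still-uncovered sentences."""
    words = [set(s.lower().split()) for s in sentences]
    n = len(sentences)
    adj = [{j for j in range(n) if j != i and len(words[i] & words[j]) > threshold}
           for i in range(n)]
    uncovered = set(range(n))
    summary = []
    while uncovered:
        # Pick the sentence covering the most uncovered sentences (itself included).
        best = max(range(n), key=lambda i: len((adj[i] | {i}) & uncovered))
        summary.append(best)
        uncovered -= adj[best] | {best}
    return [sentences[i] for i in summary]

docs = [
    "the home team won the game tonight",
    "the home team played a great game",
    "rain delayed the flight",
]
print(dominating_set_summary(docs))
```

Because the two game-related sentences dominate each other, one of them plus the unrelated sentence covers the whole graph, yielding a two-sentence summary. The greedy heuristic is the standard logarithmic-factor approximation, since minimum dominating set is NP-hard.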
Abstract:
The objective of this study is to design and develop an enzyme-linked biosensor for the detection and quantification of phosphate species. Various concentrations of phosphate species were tested in this study. Phosphate is one of the vital nutrients for all living organisms. Phosphate compounds can be found in nature (e.g., water sediments), and they often exist in an inorganic form. The amount of phosphate in the environment strongly influences the functioning of living organisms. An excess of phosphate in the environment causes eutrophication, which in turn causes an oxygen deficit for other living organisms; fish die and aquatic habitat degrades as a result. In contrast, low phosphate concentration causes death of vegetation, since plants utilize inorganic phosphate for photosynthesis, respiration, and regulation of enzymes. Therefore, the phosphate content of lakes and rivers must be monitored. Results demonstrated that phosphate species from various organisms could be detected with the enzyme-linked biosensor developed in this research.
Abstract:
Rapid developments in industry have contributed to more complex systems that are prone to failure. In applications where the presence of faults may lead to premature failure, fault detection and diagnosis (FDD) tools are often implemented. The goal of this research is to improve the diagnostic ability of existing FDD methods. Kernel Principal Component Analysis (KPCA) has good fault detection capability; however, it can only detect a fault and identify a few variables that contribute to its occurrence, and it is therefore imprecise as a diagnostic tool. Hence, KPCA was used to detect abnormal events, and the most contributing variables were extracted for further analysis in the diagnosis phase. The diagnosis phase was carried out in both a qualitative and a quantitative manner. In the qualitative mode, a network-based causality analysis method was developed to show the causal effects among the most contributing variables in the occurrence of the fault. To obtain a more quantitative diagnosis, a Bayesian network was constructed to analyze the problem from a probabilistic perspective.
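The probabilistic step can be illustrated with the smallest possible Bayesian network: a fault node and an alarm (detection) node, updated by Bayes' rule when the alarm fires. This is a two-node toy with invented probabilities, not the network constructed in the research.

```python
def posterior_fault(prior, p_alarm_given_fault, p_alarm_given_ok):
    """Bayes' rule: P(fault | alarm) from the prior fault rate,
    the detection rate, and the false-alarm rate."""
    num = p_alarm_given_fault * prior
    den = num + p_alarm_given_ok * (1.0 - prior)
    return num / den

# Hypothetical numbers: 10% prior fault rate, 90% detection rate, 5% false-alarm rate.
p = posterior_fault(prior=0.1, p_alarm_given_fault=0.9, p_alarm_given_ok=0.05)
print(round(p, 4))  # 0.6667
```

Even with a 90% detection rate, the posterior is only about two thirds, because faults are rare relative to false alarms; a full network chains such updates across many process variables.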
Abstract:
Recent studies have reported that flanking stimuli broaden the psychometric function and lower detection thresholds. In the present study, we measured psychometric functions for detection and discrimination with and without flankers to investigate whether these effects occur throughout the contrast continuum. Our results confirm that lower detection thresholds with flankers are accompanied by broader psychometric functions. Psychometric functions for discrimination reveal that discrimination thresholds with and without flankers are similar across standard levels, and that the broadening of psychometric functions with flankers disappears as standard contrast increases, to the point that psychometric functions at high standard levels are virtually identical with or without flankers. Threshold-versus-contrast (TvC) curves with flankers only differ from TvC curves without flankers in occasionally shallower dippers and lower branches on the left of the dipper, but they run virtually superimposed at high standard levels. We discuss differences between our results and others in the literature, and how they are likely attributable to the differential vulnerability of alternative psychophysical procedures to the effects of presentation order. We show that different models of flanker facilitation can fit the data equally well, which stresses that succeeding at fitting a model does not validate it in any sense.
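The two effects being compared, a lower threshold and a broader (shallower) function, can be visualized with a standard Weibull psychometric function of the kind commonly fitted to 2AFC detection data. The functional form and all parameter values here are illustrative assumptions; the abstract does not specify the function the authors fitted.

```python
import math

def weibull(c, alpha, beta, gamma=0.5, lam=0.02):
    """Weibull psychometric function for a 2AFC detection task:
    gamma = guess rate, lam = lapse rate, alpha = threshold, beta = slope."""
    return gamma + (1.0 - gamma - lam) * (1.0 - math.exp(-(c / alpha) ** beta))

# "With flankers" modeled as a lower threshold (alpha) and a shallower slope (beta):
for c in (0.005, 0.01, 0.02):
    no_flankers = weibull(c, alpha=0.010, beta=3.0)
    flankers = weibull(c, alpha=0.008, beta=1.5)
    print(f"c={c:.3f}  no flankers: {no_flankers:.3f}  flankers: {flankers:.3f}")
```

At the threshold contrast c = alpha the function passes through roughly 0.80 correct; lowering alpha shifts the curve leftward (lower threshold), while lowering beta stretches it (broader function), which is exactly the flanker signature the abstract describes at detection threshold.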