613 results for Histogram quotient


Relevance:

10.00%

Abstract:

Synthesizing data from multiple studies generates hypotheses about factors that affect the distribution and abundance of species among ecosystems. Snails are dominant herbivores in many freshwater ecosystems, but there is no comprehensive review of snail density, standing stock, or body size among freshwater ecosystems. We compile data on snail density and standing stock, estimate body size with their quotient, and discuss the major pattern that emerges. We report data from 215 freshwater ecosystems taken from 88 studies that we placed into nine categories. Sixty-five studies reported density, seven reported standing stock, and 16 reported both. Despite the breadth of studies, spatial and temporal sampling scales were limited. Researchers used 25 different sampling devices ranging in area from 0.0015 to 2.5 m². Most ecosystem categories had similar snail densities, standing stocks, and body sizes, suggesting snails shared a similar function among ecosystems. Caribbean karst wetlands were a striking exception, with much lower density and standing stock but large body size. The disparity in body size results from the presence of ampullariids in Caribbean karst wetlands, suggesting that biogeography affects the distribution of taxa, and in this case size, among aquatic ecosystems. We propose that resource quality explains the disparity in density and standing stock between Caribbean karst wetlands and other categories. Periphyton in Caribbean karst wetlands has high carbon-to-phosphorus ratios and defensive characteristics that inhibit grazers. Unlike many freshwater ecosystems where snails are key grazers, we hypothesize that a microbial loop captures much of the primary production in Caribbean karst wetlands.
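The body-size estimate described above is simply the quotient of the two compiled quantities, standing stock divided by density. A minimal Python sketch with hypothetical values (not data from the review):

```python
# Mean body size as the quotient of standing stock and density.
# All numbers below are illustrative, not values from the compiled studies.
def mean_body_size(standing_stock_g_per_m2, density_per_m2):
    """Return mean individual mass (g per snail) = standing stock / density."""
    return standing_stock_g_per_m2 / density_per_m2

# A dense population of small snails vs. a sparse population of large ones
# (the Caribbean karst wetland pattern: low density, large body size):
small = mean_body_size(standing_stock_g_per_m2=50.0, density_per_m2=500.0)  # 0.1 g
large = mean_body_size(standing_stock_g_per_m2=20.0, density_per_m2=10.0)   # 2.0 g
```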

Relevance:

10.00%

Abstract:

In the United States, the federal Empowerment Zone (EZ) program aimed to create and retain business investment in poor communities and to encourage local hiring through the use of special tax credits, relaxed regulations, social service grants, and other incentives. My dissertation explores whether the Round II Urban EZs had a beneficial impact on local communities and what factors influenced the implementation and performance of the EZs, using three modes of inquiry. First, linear regression models investigate whether the federal revitalization program had a statistically significant impact on the creation of new businesses and jobs in Round II Urban EZ communities. Second, location quotient and shift-share analysis are used to reveal the industry clusters in three EZ communities that experienced positive business and job growth. Third, qualitative analysis is employed to explore factors that influenced the implementation and performance of EZs in general, and in particular, Miami-Dade County, Florida. The results show that an EZ's presence failed to have a significant influence on local business and job growth. In communities that experienced a beneficial impact from EZs, there has been a pattern of decline in manufacturing companies and an increase in service-driven firms. The case study suggests that institutional factors, such as governance structure, leadership, administrative capacity, and community participation, have affected the effectiveness of the program's implementation and performance.
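The location quotient analysis mentioned above compares a local industry's employment share with that industry's share in a reference economy; values above 1 suggest a local concentration (a possible cluster). A minimal sketch with hypothetical numbers, not figures from the dissertation:

```python
def location_quotient(local_ind, local_total, ref_ind, ref_total):
    """LQ = (local industry employment share) / (reference-economy share).
    LQ > 1 suggests the industry is more concentrated locally than nationally."""
    return (local_ind / local_total) / (ref_ind / ref_total)

# Hypothetical EZ community: 1,200 of 10,000 local jobs in services,
# versus 90,000 of 1,000,000 jobs in the reference economy.
lq = location_quotient(1200, 10000, 90000, 1000000)  # 0.12 / 0.09 ≈ 1.33
```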

Relevance:

10.00%

Abstract:

We prove that the dimension of the 1-nullity distribution N(1) on a closed Sasakian manifold M of rank l is at least equal to 2l−1, provided that M has an isolated closed characteristic. The result is then used to provide some examples of k-contact manifolds which are not Sasakian. On a closed (2n+1)-dimensional Sasakian manifold of positive bisectional curvature, we show that either the dimension of N(1) is less than or equal to n+1 or N(1) is the entire tangent bundle TM. In the latter case, the Sasakian manifold M is isometric to a quotient of the Euclidean sphere under a finite group of isometries. We also point out some interactions between k-nullity, the Weinstein conjecture, and minimal unit vector fields.

Relevance:

10.00%

Abstract:

Ensemble Stream Modeling and Data-cleaning are sensor information processing systems with different training and testing methods by which their goals are cross-validated. This research examines a mechanism that seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and to choose the most likely model without overfitting, thus obtaining higher model confidence. Higher-quality streams can be realized by combining many short streams into an ensemble that has the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for a bush or natural forest-fire event, we take the burnt area (BA*), a sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since using non-descriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory, and conceptual knowledge is learned from sensor streams. Second is the process of feature induction by cross-validating attributes with single or multi-target variables to minimize training error. We use the F-measure score, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure such as the F-test is performed during each node's split to select the best attribute. The ensemble stream model approach proved to improve when complicated features were paired with a simpler tree classifier.
The ensemble framework for data-cleaning and the enhancements to quantify quality of fitness (30% spatial, 10% temporal, and 90% mobility reduction) of sensors led to the formation of streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast amounts of real-time mobile streams generated today.
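The F-measure used above for evaluating fire-event detection combines precision and recall into a single score (the harmonic mean when beta = 1). A minimal sketch with hypothetical detector counts, not results from this research:

```python
def f_measure(precision, recall, beta=1.0):
    """F_beta score; beta = 1 gives the harmonic mean of precision and recall."""
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical fire-event detector: 50 alarms raised, 40 of them real,
# out of 60 real fire events overall.
p = 40 / 50          # precision = 0.8
r = 40 / 60          # recall ≈ 0.667
f1 = f_measure(p, r) # ≈ 0.727
```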

Relevance:

10.00%

Abstract:

Secrecy is fundamental to computer security, but real systems often cannot avoid leaking some secret information. For this reason, the past decade has seen growing interest in quantitative theories of information flow that allow us to quantify the information being leaked. Within these theories, the system is modeled as an information-theoretic channel that specifies the probability of each output, given each input. Given a prior distribution on those inputs, entropy-like measures quantify the amount of information leakage caused by the channel.

This thesis presents new results in the theory of min-entropy leakage. First, we study the perspective of secrecy as a resource that is gradually consumed by a system. We explore this intuition through various models of min-entropy consumption. Next, we consider several composition operators that allow smaller systems to be combined into larger systems, and explore the extent to which the leakage of a combined system is constrained by the leakage of its constituents. Most significantly, we prove upper bounds on the leakage of a cascade of two channels, where the output of the first channel is used as input to the second. In addition, we show how to decompose a channel into a cascade of channels.

We also establish fundamental new results about the recently proposed g-leakage family of measures. These results further highlight the significance of channel cascading. We prove that whenever channel A is composition refined by channel B, that is, whenever A is the cascade of B and R for some channel R, the leakage of A never exceeds that of B, regardless of the prior distribution or leakage measure (Shannon leakage, guessing entropy leakage, min-entropy leakage, or g-leakage). Moreover, we show that composition refinement is a partial order if we quotient away channel structure that is redundant with respect to leakage alone.
These results are strengthened by the proof that composition refinement is the only way for one channel to never leak more than another with respect to g-leakage. Therefore, composition refinement robustly answers the question of when a channel is always at least as secure as another from a leakage point of view.
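Min-entropy leakage, the central measure above, can be computed directly from a channel matrix and a prior: it is the log of the ratio of posterior vulnerability (an adversary's best guessing probability after observing the output) to prior vulnerability. A minimal sketch for a hypothetical 2-input, 2-output channel:

```python
import math

def min_entropy_leakage(channel, prior):
    """Min-entropy leakage = log2(posterior vulnerability / prior vulnerability).
    channel[x][y] = P(y | x); prior[x] = P(x)."""
    v_prior = max(prior)
    v_post = sum(max(prior[x] * channel[x][y] for x in range(len(prior)))
                 for y in range(len(channel[0])))
    return math.log2(v_post / v_prior)

# A hypothetical noisy channel under a uniform prior:
C = [[0.9, 0.1],
     [0.2, 0.8]]
leak = min_entropy_leakage(C, [0.5, 0.5])  # log2((0.45 + 0.40) / 0.5) = log2(1.7)
```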

Relevance:

10.00%

Abstract:

The role of the principal in school settings and the principal's perceived effect on student achievement have frequently been considered vital factors in school reform. The relationships between emotional intelligence, leadership style and school culture have been widely studied. The literature reveals agreement among scholars regarding the principal's vital role in developing and fostering a positive school culture. The purpose of this study was to explore the relationships between elementary school principals' emotional intelligence, leadership style and school culture.

The researcher implemented a non-experimental ex post facto research design to investigate four specific research hypotheses. Utilizing the Qualtrics Survey Software, 57 elementary school principals within a large urban school district in southeast Florida completed the Emotional Quotient Inventory (EQ-i), and 850 of their faculty members completed the Multifactor Leadership Questionnaire (MLQ Form 5X). Faculty responses to the school district's School Climate Survey retrieved from the district's web site were used as the measure of school culture.

Linear regression analyses revealed significant positive associations between emotional intelligence and the following leadership measures: Idealized Influence-Attributes (β = .23, p < .05), Idealized Influence-Behaviors (β = .34, p < .01), Inspirational Motivation (β = .39, p < .01) and Contingent Reward (β = .33, p < .01). Hierarchical regression analyses revealed positive associations between school culture and both transformational and transactional leadership measures, and negative associations between school culture and passive-avoidant leadership measures. Significant positive associations were found between school culture and the principals' emotional intelligence over and above leadership style.
Hierarchical linear regressions to test the statistical hypothesis developed to account for alternative explanations revealed significant associations between leadership style and school culture over and above school grade.

These results suggest that emotional intelligence merits consideration in the development of leadership theory. Practical implications include suggestions that principals employ both transformational and transactional leadership strategies, and focus on developing their level of emotional intelligence. The associations between emotional intelligence, transformational leadership, Contingent Reward and school culture found in this study validate the role of the principal as the leader of school reform.

Relevance:

10.00%

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and suffer inherent calibration and noise effects, or to software techniques that filter the binning effect but do not successfully preserve the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data knowing that its statistical meaning has been faithfully preserved for its optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of such an inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed.
This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to a real-time implementation using lookup tables, a task that is also introduced in this dissertation.
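The binning artifact described above can be illustrated in a few lines: mapping integer linear channels through a rounded logarithmic transform leaves gaps between display channels at the low end and piles multiple inputs into single channels at the high end. This sketch only demonstrates the artifact on hypothetical channel numbers; it is not the dissertation's correction method:

```python
import math
from collections import Counter

# 1024 linear channels mapped onto a 4-decade, 1024-channel log display:
# y = (1024 / 4) * log10(x). Integer rounding produces the classic binning
# artifact: gaps at the low end, pile-up at the high end.
CH = 1024
display = [round((CH / 4) * math.log10(x)) for x in range(1, CH + 1)]
counts = Counter(display)

occupied = len(counts)        # far fewer than 1024 display channels are used
pileup = max(counts.values()) # several inputs share one high display channel
```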

Relevance:

10.00%

Abstract:

Lung cancer is one of the most common types of cancer and has the highest mortality rate. Patient survival is highly correlated with early detection. Computed tomography serves the early detection of lung cancer tremendously by offering a minimally invasive medical diagnostic tool. However, the large amount of data per examination makes interpretation difficult, which leads to nodules being missed by human radiologists. This thesis presents the development of a computer-aided detection (CADe) tool for the detection of lung nodules in computed tomography studies. The system, called LCD-OpenPACS (Lung Cancer Detection - OpenPACS), is meant to be integrated into the OpenPACS system and to meet all the requirements for use in the workflow of health facilities belonging to the SUS (the Brazilian public health system). LCD-OpenPACS makes use of image processing techniques (region growing and watershed), feature extraction (histogram of oriented gradients), dimensionality reduction (principal component analysis) and a classifier (support vector machine). The system was tested on 220 cases, totaling 296 pulmonary nodules, achieving a sensitivity of 94.4% with 7.04 false positives per case. Total processing time was approximately 10 minutes per case. The system detects pulmonary nodules (solitary, juxtavascular, ground-glass opacity and juxtapleural) between 3 mm and 30 mm.
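The histogram-of-oriented-gradients step in the pipeline above accumulates gradient magnitudes into orientation bins. A toy, pure-Python version of that idea (a heavy simplification of the real HOG descriptor, and not the LCD-OpenPACS code):

```python
import math

def orientation_histogram(img, bins=9):
    """Toy HOG-style descriptor: a histogram of gradient orientations,
    weighted by gradient magnitude, over a whole grayscale image
    (given as a list of rows). Real HOG also uses cells and block
    normalization; this sketch omits both."""
    hist = [0.0] * bins
    h, w = len(img), len(img[0])
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = img[i][j + 1] - img[i][j - 1]          # central differences
            gy = img[i + 1][j] - img[i - 1][j]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180  # unsigned orientation
            hist[int(ang / (180 / bins)) % bins] += mag
    return hist

# A vertical edge produces purely horizontal gradients, i.e. all the
# energy lands in the 0-degree bin:
img = [[0, 0, 10, 10]] * 4
h = orientation_histogram(img)
```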

Relevance:

10.00%

Abstract:

To explore the relationship between memory and early school performance, we used graph theory to investigate memory reports from 76 children aged 6–8 years. The reports comprised autobiographical memories of events days to years past, and memories of novel images reported immediately after encoding. We also measured intelligence quotient (IQ) and theory of mind (ToM). Reading and Mathematics were assessed before classes began (December 2013), around the time of report collection (June 2014), and at the end of the academic year (December 2014). IQ and ToM correlated positively with word diversity and word-to-word connectivity, and negatively with word recurrence. Connectivity correlated positively with Reading in June 2014 as well as December 2014, even after adjusting for IQ and ToM. To our knowledge, this is the first study demonstrating a link between the structure of children’s memories and their cognitive or academic performance.
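Graph measures like those above can be sketched by treating a memory report as a directed graph whose nodes are words and whose edges connect consecutive words. These are simplified, hypothetical versions of the study's metrics, for illustration only:

```python
def word_graph_metrics(report):
    """Toy versions of the report metrics: word diversity (unique words),
    connectivity (distinct word-to-word edges), and recurrence (how often
    words repeat). The study's exact graph measures may differ."""
    words = report.lower().split()
    nodes = set(words)
    edges = {(a, b) for a, b in zip(words, words[1:])}
    diversity = len(nodes)
    connectivity = len(edges)
    recurrence = len(words) - len(nodes)
    return diversity, connectivity, recurrence

# A made-up miniature "report":
d, c, r = word_graph_metrics("the dog chased the cat and the cat ran")
```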


Relevance:

10.00%

Abstract:

The goal of modern radiotherapy is to precisely deliver a prescribed radiation dose to delineated target volumes that contain a significant amount of tumor cells while sparing the surrounding healthy tissues and organs. Precise delineation of treatment and avoidance volumes is the key to precision radiation therapy. In recent years, considerable clinical and research efforts have been devoted to integrating MRI into the radiotherapy workflow, motivated by its superior soft tissue contrast and functional imaging possibilities. Dynamic contrast-enhanced MRI (DCE-MRI) is a noninvasive technique that measures properties of tissue microvasculature. Its sensitivity to radiation-induced vascular pharmacokinetic (PK) changes has been preliminarily demonstrated. In spite of its great potential, two major challenges have limited DCE-MRI's clinical application in radiotherapy assessment: the technical limitations of accurate DCE-MRI imaging implementation and the need for novel DCE-MRI data analysis methods that yield richer functional heterogeneity information.

This study aims at improving current DCE-MRI techniques and developing new DCE-MRI analysis methods for particular radiotherapy assessment. Thus, the study is naturally divided into two parts. The first part focuses on DCE-MRI temporal resolution as one of the key DCE-MRI technical factors, and some improvements regarding DCE-MRI temporal resolution are proposed; the second part explores the potential value of image heterogeneity analysis and multiple PK model combination for therapeutic response assessment, and several novel DCE-MRI data analysis methods are developed.

I. Improvement of DCE-MRI temporal resolution. First, the feasibility of improving DCE-MRI temporal resolution via image undersampling was studied. Specifically, a novel MR image iterative reconstruction algorithm was studied for DCE-MRI reconstruction. This algorithm builds on the recently developed compressed sensing (CS) theory: by utilizing a limited k-space acquisition with shorter imaging time, images can be reconstructed in an iterative fashion under the regularization of a newly proposed total generalized variation (TGV) penalty term. In a retrospective study of brain radiosurgery patient DCE-MRI scans under IRB approval, the clinically obtained image data were selected as reference data, and the simulated accelerated k-space acquisition was generated by undersampling the reference image's full k-space with designed sampling grids. Two undersampling strategies were proposed: 1) a radial multi-ray grid with a special angular distribution was adopted to sample each slice of the full k-space; 2) a Cartesian random sampling grid series with spatiotemporal constraints from adjacent frames was adopted to sample the dynamic k-space series at a slice location. Two sets of PK parameter maps were generated from the undersampled data and from the fully-sampled data, respectively. Multiple quantitative measurements and statistical studies were performed to evaluate the accuracy of PK maps generated from the undersampled data in reference to the PK maps generated from the fully-sampled data. Results showed that at a simulated acceleration factor of four, PK maps could be faithfully calculated from the DCE images reconstructed using undersampled data, and no statistically significant differences were found between the regional PK mean values from the undersampled and fully-sampled data sets. DCE-MRI acceleration using the investigated image reconstruction method therefore appears feasible and promising.

Second, for high temporal resolution DCE-MRI, a new PK model fitting method was developed to solve for PK parameters with better calculation accuracy and efficiency. This method is based on a derivative-based deformation of the commonly used Tofts PK model, which is usually presented as an integral expression. The method also includes an advanced Kolmogorov-Zurbenko (KZ) filter to remove potential noise in the data, and solves for the PK parameters as a linear problem in matrix form. In a computer simulation study, PK parameters representing typical intracranial values were selected as references to simulate DCE-MRI data at different temporal resolutions and data noise levels. Results showed that at both high temporal resolutions (<1 s) and a clinically feasible temporal resolution (~5 s), the new method calculated PK parameters more accurately than current methods at clinically relevant noise levels; at high temporal resolutions, its calculation efficiency was superior to current methods by roughly two orders of magnitude. In a retrospective study of clinical brain DCE-MRI scans, the PK maps derived from the proposed method were comparable with the results from current methods. Based on these results, it can be concluded that this new method can be used for accurate and efficient PK model fitting for high temporal resolution DCE-MRI.
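The linear-in-parameters idea behind such a fitting method can be sketched as follows: integrating the Tofts equation dCt/dt = Ktrans·Cp − kep·Ct gives Ct = Ktrans·∫Cp − kep·∫Ct, which is linear in (Ktrans, kep) and solvable by ordinary least squares. This is an assumed simplification for illustration (no KZ filtering, noise-free synthetic data), not the dissertation's implementation:

```python
import math

def fit_tofts_linear(cp, ct, dt):
    """Least-squares fit of Ct[n] = Ktrans*A[n] - kep*B[n], where A and B are
    running (rectangle-rule) integrals of Cp and Ct. Solves the 2x2 normal
    equations for the design columns (A, -B)."""
    a = b = 0.0
    A, B = [], []
    for i in range(len(ct)):
        A.append(a)
        B.append(b)
        a += cp[i] * dt
        b += ct[i] * dt
    m11 = sum(x * x for x in A)
    m22 = sum(x * x for x in B)
    m12 = -sum(x * y for x, y in zip(A, B))
    v1 = sum(x * y for x, y in zip(A, ct))
    v2 = -sum(x * y for x, y in zip(B, ct))
    det = m11 * m22 - m12 * m12
    ktrans = (v1 * m22 - v2 * m12) / det
    kep = (m11 * v2 - m12 * v1) / det
    return ktrans, kep

# Simulate a tissue curve with known (made-up) parameters, then recover them:
dt, n = 0.1, 300
cp = [t * math.exp(-t) for t in (i * dt for i in range(n))]  # toy input function
ktrans_true, kep_true = 0.25, 0.5
ct = [0.0] * n
for i in range(n - 1):
    ct[i + 1] = ct[i] + dt * (ktrans_true * cp[i] - kep_true * ct[i])
ktrans, kep = fit_tofts_linear(cp, ct, dt)
```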

II. Development of DCE-MRI analysis methods for therapeutic response assessment. This part aims at methodology developments in two approaches. The first one is to develop model-free analysis method for DCE-MRI functional heterogeneity evaluation. This approach is inspired by the rationale that radiotherapy-induced functional change could be heterogeneous across the treatment area. The first effort was spent on a translational investigation of classic fractal dimension theory for DCE-MRI therapeutic response assessment. In a small-animal anti-angiogenesis drug therapy experiment, the randomly assigned treatment/control groups received multiple fraction treatments with one pre-treatment and multiple post-treatment high spatiotemporal DCE-MRI scans. In the post-treatment scan two weeks after the start, the investigated Rényi dimensions of the classic PK rate constant map demonstrated significant differences between the treatment and the control groups; when Rényi dimensions were adopted for treatment/control group classification, the achieved accuracy was higher than the accuracy from using conventional PK parameter statistics. Following this pilot work, two novel texture analysis methods were proposed. First, a new technique called Gray Level Local Power Matrix (GLLPM) was developed. It intends to solve the lack of temporal information and poor calculation efficiency of the commonly used Gray Level Co-Occurrence Matrix (GLCOM) techniques. In the same small animal experiment, the dynamic curves of Haralick texture features derived from the GLLPM had an overall better performance than the corresponding curves derived from current GLCOM techniques in treatment/control separation and classification. The second developed method is dynamic Fractal Signature Dissimilarity (FSD) analysis. Inspired by the classic fractal dimension theory, this method measures the dynamics of tumor heterogeneity during the contrast agent uptake in a quantitative fashion on DCE images. 
In the small animal experiment mentioned before, the selected parameters from dynamic FSD analysis showed significant differences between treatment and control groups as early as after one treatment fraction; in contrast, metrics from conventional PK analysis showed significant differences only after three treatment fractions. When dynamic FSD parameters were used, treatment/control group classification after the first treatment fraction was better than with conventional PK statistics. These results suggest the promise of this novel method for capturing early therapeutic response.

The second approach to developing novel DCE-MRI methods is to combine PK information from multiple PK models. Currently, the classic Tofts model or its alternative version has been widely adopted for DCE-MRI analysis as a gold-standard approach for therapeutic response assessment. Previously, a shutter-speed (SS) model was proposed to incorporate the transcytolemmal water exchange effect into contrast agent concentration quantification. In spite of its richer biological assumptions, its application in therapeutic response assessment has been limited. It is therefore intriguing to combine the information from the SS model and from the classic Tofts model to explore potential new biological information for treatment assessment. The feasibility of this idea was investigated in the same small animal experiment. The SS model was compared against the Tofts model for therapeutic response assessment using PK parameter regional mean value comparison. Based on the modeled transcytolemmal water exchange rate, a biological subvolume was proposed and was automatically identified using histogram analysis. Within the biological subvolume, the PK rate constant derived from the SS model was shown to be superior to the one from the Tofts model in treatment/control separation and classification. Furthermore, novel biomarkers were designed to integrate the PK rate constants from these two models. When evaluated in the biological subvolume, this biomarker was able to reflect significant treatment/control differences in both post-treatment evaluations. These results confirm the potential value of the SS model, as well as its combination with the Tofts model, for therapeutic response assessment.

In summary, this study addressed two problems of DCE-MRI application in radiotherapy assessment. In the first part, a method of accelerating DCE-MRI acquisition for better temporal resolution was investigated, and a novel PK model fitting algorithm was proposed for high temporal resolution DCE-MRI. In the second part, two model-free texture analysis methods and a multiple-model analysis method were developed for DCE-MRI therapeutic response assessment. The presented work could benefit future routine clinical application of DCE-MRI in radiotherapy assessment.

Relevance:

10.00%

Abstract:

First-order transitions of systems in which both lattice site occupancy and lattice spacing fluctuate, such as cluster crystals, cannot be efficiently studied by traditional simulation methods, which necessarily fix one of these two degrees of freedom. The difficulty, however, can be surmounted by the generalized [N]pT ensemble [J. Chem. Phys. 136, 214106 (2012)]. Here we show that histogram reweighting and the [N]pT ensemble can be used to study an isostructural transition between cluster crystals of different occupancy in the generalized exponential model of index 4 (GEM-4). Extending this scheme to finite-size scaling studies also allows us to accurately determine the critical point parameters and to verify that the transition belongs to the Ising universality class.
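The histogram reweighting idea used above can be shown in miniature: samples collected at one inverse temperature β0 are reweighted by exp(−(β−β0)E) to estimate averages at a nearby β without new simulation. A toy single-histogram sketch with synthetic samples (it illustrates only the reweighting step, not the [N]pT scheme itself):

```python
import math
import random

def reweight_mean_energy(energies, beta0, beta):
    """Single-histogram reweighting: estimate <E> at inverse temperature beta
    from samples drawn at beta0, using weights exp(-(beta - beta0) * E)."""
    emin = min(energies)  # shift for numerical stability; cancels in the ratio
    w = [math.exp(-(beta - beta0) * (e - emin)) for e in energies]
    return sum(e * wi for e, wi in zip(energies, w)) / sum(w)

random.seed(1)
samples = [random.gauss(10.0, 1.0) for _ in range(5000)]       # toy E samples at beta0
e_same = reweight_mean_energy(samples, beta0=1.0, beta=1.0)    # recovers plain mean
e_colder = reweight_mean_energy(samples, beta0=1.0, beta=1.1)  # <E> shifts downward
```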

Relevance:

10.00%

Abstract:

Oil-polluted and unpolluted soils (crude oil hydrocarbon contents: 20-92500 mg/kg dry soil mass) under natural grass and forest vegetation and in a bog in the Russian tundra were compared in their principal soil ecological parameters, oil content and microbial indicators. CFE biomass-C, dehydrogenase and arylsulfatase activity were enhanced where crude oil was present. When these parameters are used to monitor remediation and recultivation success, it is not possible to distinguish between the promotion of microbial activity by oil carbon and by soil organic carbon (SOC). For this reason we think that these parameters are not appropriate indicators of soil damage by an oil impact. In contrast, the metabolic quotient (qCO2), calculated as the ratio between soil basal respiration and SIR biomass-C, was adequate to indicate high crude oil contamination in soil. The β-glucosidase activity (parameter β-GL/SOC) was also correlated negatively with oil in soil. Indicating soil damage with the stress parameter qCO2 or with specific enzyme activities (activity/SOC) minimizes the promoting effect of the recent SOC content on microbial parameters. Both biomass methods (SIR, CFE) have technical problems when applied to crude oil-contaminated and subarctic soils. CFE does not reflect the low C_mic level of the cold tundra soils. We recommend testing every method for its suitability before any serial data collection or application to cold soils, and using ecophysiological ratios such as R_mic/C_mic, C_mic/SOC or enzymatic activity/SOC instead of absolute data.
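The metabolic quotient above is a simple ratio of basal respiration to microbial biomass carbon; elevated values indicate microbial stress. A minimal sketch with illustrative (not measured) values:

```python
def metabolic_quotient(basal_resp, biomass_c):
    """qCO2 = basal respiration / microbial biomass C
    (e.g. µg CO2-C per mg C_mic per hour). Higher values suggest stress.
    The numbers used below are hypothetical, not data from this study."""
    return basal_resp / biomass_c

q_oiled = metabolic_quotient(basal_resp=2.4, biomass_c=0.8)    # 3.0
q_control = metabolic_quotient(basal_resp=1.0, biomass_c=1.0)  # 1.0
```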

Relevance:

10.00%

Abstract:

Global warming is real and has been with us for at least two decades. Questions arise regarding the response of the ocean to greenhouse forcing, including expectations for changes in ocean circulation, in uptake of excess carbon dioxide, and in upwelling activity. The large climate variations of the ice ages, within the last million years, offer the opportunity to study responses of the ocean to climate change. A histogram of sea-level positions for the last 700,000 years (based on a new δ18O stratigraphy compiled here) shows that the present is near the margin of the range of fluctuations, with only 6 percent of positions indicating a warmer climate. Thus, the future will be largely outside of experience with regard to the fluctuations of the recent geologic past. The same is true for greenhouse forcing. Our inability to explain sudden climate change in the past, including the rapid rise of carbon dioxide during deglaciation and differences in ocean productivity between glacial and interglacial conditions, demonstrates a lack of understanding that makes predictions suspect. This is the lesson from ice age studies.

Relevance:

10.00%

Abstract:

Carbon dioxide and oxygen fluxes were measured in 0.2 m² enclosures placed at the water-sediment interface in the SW lagoon of New Caledonia. Experiments, performed at several stations in a wide range of environments, were carried out both in darkness, to estimate respiration, and in ambient light, to assess the effects of primary production. The community respiratory quotient (CRQ = CO2 production rate / O2 consumption rate) and the community photosynthetic quotient (CPQ = gross O2 production rate / gross CO2 consumption rate) were calculated by functional regressions. The CRQ value, calculated from 61 incubations, was 1.14 (S.E. 0.05) and the CPQ value, obtained from 18 incubations, was 1.03 (S.E. 0.08). The linearity of the relationship between the O2 and CO2 fluxes suggests that these values are representative for the whole lagoon.
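A functional regression of the kind used above (often a geometric-mean or Model II regression) takes the slope as the ratio of the two standard deviations, signed by the correlation, which is appropriate when both variables (here, the O2 and CO2 fluxes) carry measurement error. A minimal sketch assuming that variant, with synthetic flux pairs:

```python
import math

def functional_regression_slope(xs, ys):
    """Geometric-mean (Model II) regression slope: sqrt(Syy / Sxx), with the
    sign of the covariance. Used when both variables have measurement error."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return math.copysign(math.sqrt(syy / sxx), sxy)

# Hypothetical flux pairs with CO2 = 1.14 * O2 (a CRQ-like slope):
o2 = [1.0, 2.0, 3.0, 4.0, 5.0]
co2 = [1.14 * x for x in o2]
slope = functional_regression_slope(o2, co2)
```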