947 results for modeling and model calibration


Relevance:

100.00%

Publisher:

Abstract:

Ensemble stream modeling and data cleaning are sensor information processing systems with different training and testing methods by which their goals are cross-validated. This research examines a mechanism that seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events, eliminate uncorrelated noise, and choose the most likely model without overfitting, thus obtaining higher model confidence. Higher-quality streams can be realized by combining many short streams into an ensemble that has the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for an event such as a bush or natural forest fire, we take the burnt area (BA*), the sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since using non-descriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory, and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single- or multi-target variables to minimize training error. We use the F-measure, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure such as the F-test is performed at each node's split to select the best attribute. The ensemble stream model approach proved to improve when complex features were used with a simpler tree classifier.
The ensemble framework for data cleaning and the enhancements to quantify the quality of fit (30% spatial, 10% temporal, and 90% mobility reduction) of sensor data led to the formation of streams for sensor-enabled applications. This further motivates the novelty of stream quality labeling and its importance in handling the vast amounts of real-time mobile streams generated today.
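As a minimal sketch (assumed here, not taken from the thesis), the F-measure used to evaluate fire-event detection combines precision and recall into a single score:

```python
# Hypothetical illustration: F-beta score from detection counts.
def f_measure(tp, fp, fn, beta=1.0):
    """Return the F-beta score from true-positive, false-positive,
    and false-negative counts (beta=1 gives the usual F1 score)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Example: 80 correctly detected fire events, 10 false alarms, 20 misses.
score = f_measure(tp=80, fp=10, fn=20)
```

A high F-measure implies both few false alarms (high precision) and few missed fire events (high recall), which is why it is a natural single-number summary for skewed event data.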

Relevance:

100.00%

Publisher:

Abstract:

A description and model of the near-surface hydrothermal system at Casa Diablo, with its implications for the larger-scale hydrothermal system of Long Valley, California, is presented. The data include resistivity profiles with penetrations to three different depth ranges, and analyses of inorganic mercury concentrations in 144 soil samples taken over a 1.3 by 1.7 km area. Analyses of the data, together with mapping of active surface hydrothermal features (fumaroles, mudpots, etc.), have revealed that the relationship between the hydrothermal system, surface hydrothermal activity, and mercury anomalies is strongly controlled by faults and topography. There are, however, more subtle factors responsible for the location of many active and anomalous zones, such as fractures, zones of high permeability, and interactions between hydrothermal and cooler groundwater. In addition, the near-surface location of the upwelling from the deep hydrothermal reservoir, which supplies the geothermal power plants at Casa Diablo and the numerous hot pools in the caldera with hydrothermal water, has been detected. The data indicate that after upwelling, the hydrothermal water flows eastward at shallow depth for at least 2 km and probably continues another 10 km to the east, all the way to Lake Crowley.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this research was to apply model checking by using a symbolic model checker on Predicate Transition Nets (PrT Nets). A PrT Net is a formal model of information flow which allows system properties to be modeled and analyzed. The aim of this thesis was to use the modeling and analysis power of PrT nets to provide a mechanism by which a system model can be verified. Symbolic Model Verifier (SMV) was the model checker chosen in this thesis, and in order to verify the PrT net model of a system, it was translated into the SMV input language. A software tool was implemented which translates a PrT Net into the SMV language, hence enabling the process of model checking. The system includes two parts: the PrT net editor, where the representation of a system can be edited, and the translator, which converts the PrT net into an SMV program.
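The essence of what a model checker does for a translated net can be sketched in a few lines (a hypothetical illustration, not the thesis tool, which works symbolically via SMV): exhaustively explore the reachable states and report a counterexample trace if any state violates a safety property.

```python
# Hypothetical sketch of explicit-state safety checking by
# breadth-first exploration of a transition system.
from collections import deque

def check_safety(initial, successors, safe):
    """Return a counterexample path to an unsafe state, or None if
    every reachable state satisfies the safety predicate."""
    queue = deque([(initial, [initial])])
    visited = {initial}
    while queue:
        state, path = queue.popleft()
        if not safe(state):
            return path  # counterexample trace
        for nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None

# Toy example: a counter that must never exceed 3.
succ = lambda s: [s + 1] if s < 5 else []
trace = check_safety(0, succ, lambda s: s <= 3)
```

SMV avoids enumerating states one by one (it represents state sets symbolically), but the verification question it answers is the same as this explicit search.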

Relevance:

100.00%

Publisher:

Abstract:

Ensuring the correctness of software has been a major motivation in software research, constituting a Grand Challenge. Because of its impact on the final implementation, one critical aspect of software is its architectural design. By guaranteeing a correct architectural design, major and costly flaws can be caught early in the development cycle. Software architecture design has received much attention in recent years, with several methods, techniques, and tools developed. However, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at FIU (Florida International University). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets, and the properties in first-order linear temporal logic. This dissertation presents a formal verification and testing approach to guarantee the correctness of software architectures expressed in SAM. For the formal verification approach, the technique applied was model checking, and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated into a model in the input language of Spin and verified for correctness with respect to temporal properties. In terms of testing, a testing approach for SAM architectures was defined, which includes the evaluation of test cases based on Petri net testing theory to be used in the testing process at the design level.
Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (SAM tool) was implemented to help support the design and analysis of SAM models. The results show the applicability of the approach to testing and verification of SAM models with the aid of the SAM tool.
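Since SAM behavior models are Petri nets, the core execution rule they rely on can be sketched as follows (an illustrative example assumed here, not the SAM tool itself): a transition may fire when each of its input places holds enough tokens, and firing consumes input tokens and produces output tokens.

```python
# Hypothetical sketch of the Petri net firing rule.
def enabled(marking, transition):
    """A transition is enabled if every input place holds the
    required number of tokens."""
    return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

def fire(marking, transition):
    """Return the new marking after firing an enabled transition:
    consume input tokens, produce output tokens."""
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n
    return m

# Toy net: a token in place 'req' moves to place 'ack' via transition t.
t = {"in": {"req": 1}, "out": {"ack": 1}}
m0 = {"req": 1, "ack": 0}
m1 = fire(m0, t) if enabled(m0, t) else m0
```

Sequences of such markings are exactly what a translation to Spin explores when checking temporal properties over the architecture's behavior.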

Relevance:

100.00%

Publisher:

Abstract:

Ultrasonic P wave transmission seismograms recorded on sediment cores have been analyzed to study the acoustic and estimate the elastic properties of marine sediments from different provinces dominated by terrigenous, calcareous, and diatomaceous sedimentation. Instantaneous frequencies computed from the transmission seismograms are displayed as gray-shaded images to give an acoustic overview of the lithology of each core. Centimeter-scale variations in the ultrasonic waveforms associated with lithological changes are illustrated by wiggle traces in detail. Cross-correlation, multiple-filter, and spectral ratio techniques are applied to derive P wave velocities and attenuation coefficients. S wave velocities and attenuation coefficients, elastic moduli, and permeabilities are calculated by an inversion scheme based on the Biot-Stoll viscoelastic model. Together with porosity measurements, P and S wave scatter diagrams are constructed to characterize different sediment types by their velocity- and attenuation-porosity relationships. They demonstrate that terrigenous, calcareous, and diatomaceous sediments cover different velocity- and attenuation-porosity ranges. In terrigenous sediments, P wave velocities and attenuation coefficients decrease rapidly with increasing porosity, whereas S wave velocities and shear moduli are very low. Calcareous sediments behave similarly at relatively higher porosities. Foraminifera skeletons in compositions of terrigenous mud and calcareous ooze cause a stiffening of the frame accompanied by higher shear moduli, P wave velocities, and attenuation coefficients. In diatomaceous ooze the contribution of the shear modulus becomes increasingly important and is controlled by the opal content, whereas attenuation is very low. This leads to the opportunity to predict the opal content from nondestructive P wave velocity measurements at centimeter-scale resolution.
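Two of the measurements above have simple computational cores, sketched here under stated assumptions (the formulas are standard, not the paper's actual implementation): P wave velocity is path length over travel time, and in the spectral ratio technique the log amplitude ratio of a reference and a sample spectrum grows linearly with frequency, with a slope proportional to the attenuation coefficient.

```python
# Hedged sketch: velocity from transmission travel time, and the
# spectral ratio slope used to estimate attenuation.
import numpy as np

def p_velocity(path_length_m, travel_time_s):
    """P wave velocity across the core from path length and travel time."""
    return path_length_m / travel_time_s

def attenuation_slope(freqs_hz, spec_ref, spec_sample):
    """Least-squares slope of ln(reference/sample spectrum) vs. frequency;
    the slope is proportional to the attenuation coefficient."""
    y = np.log(spec_ref / spec_sample)
    slope, _ = np.polyfit(freqs_hz, y, 1)
    return slope

# Example (hypothetical numbers): a 7 cm path crossed in 46.7 microseconds
# gives roughly 1500 m/s, typical of high-porosity marine sediments.
v = p_velocity(0.07, 46.7e-6)
```

The Biot-Stoll inversion then takes such velocity and attenuation estimates (plus porosity) as inputs to recover shear properties and permeability.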

Relevance:

100.00%

Publisher:

Abstract:

We present the stellar calibrator sample and the conversion from instrumental to physical units for the 24 μm channel of the Multiband Imaging Photometer for Spitzer (MIPS). The primary calibrators are A stars, and the calibration factor based on those stars is 4.54 × 10^-2 MJy sr^–1 (DN/s)^–1, with a nominal uncertainty of 2%. We discuss the data reduction procedures required to attain this accuracy; without these procedures, the calibration factor obtained using the automated pipeline at the Spitzer Science Center is 1.6% ± 0.6% lower. We extend this work to predict 24 μm flux densities for a sample of 238 stars that covers a larger range of flux densities and spectral types. We present a total of 348 measurements of 141 stars at 24 μm. This sample covers a factor of ~460 in 24 μm flux density, from 8.6 mJy up to 4.0 Jy. We show that the calibration is linear over that range with respect to target flux and background level. The calibration is based on observations made using 3 s exposures; a preliminary analysis shows that the calibration factor may be 1% and 2% lower for 10 and 30 s exposures, respectively. We also demonstrate that the calibration is very stable: over the course of the mission, repeated measurements of our routine calibrator, HD 159330, show a rms scatter of only 0.4%. Finally, we show that the point-spread function (PSF) is well measured and allows us to calibrate extended sources accurately; Infrared Astronomy Satellite (IRAS) and MIPS measurements of a sample of nearby galaxies are identical within the uncertainties.
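The conversion the abstract describes is a single multiplication by the calibration factor; a minimal sketch (the factor is quoted above; the function name and example rate are illustrative assumptions):

```python
# Sketch of converting MIPS 24 micron instrumental units to physical units.
CAL_FACTOR = 4.54e-2  # MJy sr^-1 per DN/s, nominal uncertainty ~2%

def dn_rate_to_surface_brightness(dn_per_s):
    """Convert a 24 micron count rate (DN/s) to surface brightness (MJy/sr)."""
    return CAL_FACTOR * dn_per_s

brightness = dn_rate_to_surface_brightness(100.0)  # -> 4.54 MJy/sr
```

Note the abstract's caveats: the factor assumes 3 s exposures and the careful reduction procedures described, and may be 1-2% lower for longer exposures.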

Relevance:

100.00%

Publisher:

Abstract:

Transcription factors (TFs) control the temporal and spatial expression of target genes by interacting with DNA in a sequence-specific manner. Recent advances in high throughput experiments that measure TF-DNA interactions in vitro and in vivo have facilitated the identification of DNA binding sites for thousands of TFs. However, it remains unclear how each individual TF achieves its specificity, especially in the case of paralogous TFs that recognize distinct target genomic sites despite sharing very similar DNA binding motifs. In my work, I used a combination of high throughput in vitro protein-DNA binding assays and machine-learning algorithms to characterize and model the binding specificity of 11 paralogous TFs from 4 distinct structural families. My work proves that even very closely related paralogous TFs, with indistinguishable DNA binding motifs, oftentimes exhibit differential binding specificity for their genomic target sites, especially for sites with moderate binding affinity. Importantly, the differences I identify in vitro and through computational modeling help explain, at least in part, the differential in vivo genomic targeting by paralogous TFs. Future work will focus on in vivo factors that might also be important for specificity differences between paralogous TFs, such as DNA methylation, interactions with protein cofactors, or the chromatin environment. In this larger context, my work emphasizes the importance of intrinsic DNA binding specificity in targeting of paralogous TFs to the genome.
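A common first step in modeling TF binding specificity from high-throughput assay data is to featurize probe sequences as k-mer counts before fitting a model; a hypothetical sketch (not the thesis pipeline, whose features and learners are not specified here):

```python
# Hypothetical sketch: represent a DNA sequence as 3-mer counts,
# a typical feature set for learning binding specificity.
from itertools import product

def kmer_features(seq, k=3):
    """Count occurrences of each DNA k-mer in a sequence."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = {km: 0 for km in kmers}
    for i in range(len(seq) - k + 1):
        counts[seq[i : i + k]] += 1
    return counts

feats = kmer_features("ACGTACGT", k=3)  # 6 overlapping 3-mers
```

Feature vectors like this, paired with measured binding intensities for two paralogs, let a model learn which sequence contexts drive their differential specificity even when their core motifs look identical.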

Relevance:

100.00%

Publisher:

Abstract:

Spectral CT using a photon counting x-ray detector (PCXD) shows great potential for measuring material composition based on energy-dependent x-ray attenuation. Spectral CT is especially suited for imaging with K-edge contrast agents to address the otherwise limited contrast in soft tissues. We have developed a micro-CT system based on a PCXD. This system enables full-spectrum CT, in which the energy thresholds of the PCXD are swept to sample the full energy spectrum for each detector element and projection angle. Measurements provided by the PCXD, however, are distorted due to undesirable physical effects in the detector and are very noisy due to photon starvation. In this work, we proposed two methods based on machine learning to address the spectral distortion issue and to improve the material decomposition. The first approach is to model distortions using an artificial neural network (ANN) and compensate for the distortion in a statistical reconstruction. The second approach is to directly correct for the distortion in the projections. Both techniques can be carried out as a calibration process in which the neural network is trained on data from 3D-printed phantoms to learn the distortion model or the correction model of the spectral distortion. This replaces the need for the synchrotron measurements required by conventional techniques to derive the distortion model parametrically, which can be costly and time consuming. The results demonstrate the experimental feasibility and potential advantages of ANN-based distortion modeling and correction for more accurate K-edge imaging with a PCXD. Given the computational efficiency with which the ANN can be applied to projection data, the proposed scheme can be readily integrated into existing CT reconstruction pipelines.
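The calibration idea of learning a correction from phantom data can be illustrated with a deliberately simplified stand-in (an assumption for illustration: a linear least-squares map rather than the paper's ANN, and simulated rather than measured phantom data):

```python
# Simplified stand-in for the learned correction: fit a linear map from
# distorted detector measurements back to ideal values using known
# phantom data, then apply it to new projections.
import numpy as np

def fit_correction(distorted, ideal):
    """Least-squares matrix W such that distorted @ W approximates ideal."""
    W, *_ = np.linalg.lstsq(distorted, ideal, rcond=None)
    return W

# Simulated phantom training data: 2 energy bins with crosstalk + gain error.
rng = np.random.default_rng(0)
ideal = rng.uniform(10, 100, size=(200, 2))
mix = np.array([[0.9, 0.1], [0.2, 0.8]])  # simulated spectral distortion
distorted = ideal @ mix
W = fit_correction(distorted, ideal)
corrected = distorted @ W  # recovers ideal up to numerical error
```

An ANN plays the same role as `W` here but can capture the nonlinear detector effects (pulse pileup, charge sharing) that a linear map cannot.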

Relevance:

100.00%

Publisher:

Abstract:

In an overcapacity world, where customers can choose from many similar products to satisfy their needs, enterprises are looking for new approaches and tools that can help them not only maintain but also increase their competitive edge. Innovation, flexibility, quality, and service excellence are required to, at the very least, survive the ongoing transition that industry is experiencing from mass production to mass customization. To help these enterprises, this research develops a Supply Chain Capability Maturity Model named S(CM)2. The Supply Chain Capability Maturity Model is intended to model, analyze, and improve the supply chain management operations of an enterprise. It provides a clear roadmap for enterprise improvement, covering multiple views and abstraction levels of the supply chain, and provides tools to aid the firm in making improvements. The principal research tool applied is the Delphi method, which systematically gathered the knowledge and experience of eighty-eight experts in Mexico. The model is validated using a case study and interviews with experts in supply chain management. The resulting contribution is a holistic model of the supply chain integrating multiple perspectives and providing a systematic procedure for the improvement of a company's supply chain operations.
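Delphi studies iterate rounds of expert ratings until consensus; one common stopping criterion (an assumption for illustration, not necessarily the one used in this dissertation) is that the interquartile range of the ratings falls below a threshold:

```python
# Hedged illustration of a Delphi consensus check based on the
# interquartile range (IQR) of expert ratings.
import statistics

def has_consensus(ratings, max_iqr=1.0):
    """Declare consensus when the IQR of expert ratings is small."""
    q = statistics.quantiles(ratings, n=4)  # q[0]=Q1, q[2]=Q3
    return (q[2] - q[0]) <= max_iqr

round1 = [3, 5, 2, 4, 5, 1, 4]  # ratings still widely spread
round2 = [4, 4, 3, 4, 4, 3, 4]  # opinions have converged
```

Under this criterion, `round1` would trigger another Delphi iteration while `round2` would not.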

Relevance:

100.00%

Publisher:

Abstract:

One of the leading motivations behind the multilingual Semantic Web is to make resources digitally accessible in an online global multilingual context. Consequently, it is fundamental for knowledge bases to manage multilingualism and thus to be equipped with procedures for its conceptual modelling. In this context, the goal of this paper is to discuss how common-sense knowledge and cultural knowledge are modelled in a multilingual framework. More particularly, multilingualism and conceptual modelling are dealt with from the perspective of FunGramKB, a lexico-conceptual knowledge base for natural language understanding. This project argues for a clear division between the lexical and the conceptual dimensions of knowledge. Moreover, the conceptual layer is organized into three modules, which result from a strong commitment to capturing semantic knowledge (Ontology), procedural knowledge (Cognicon), and episodic knowledge (Onomasticon). Cultural mismatches are discussed and formally represented at the three conceptual levels of FunGramKB.

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08