895 results for Model Driven Software Development, Arduino, Meta-Modeling, Domain Specific Languages, Software Factory


Relevance: 100.00%

Abstract:

Based on outcrop, borehole, seismic and regional geological data, the sequence stratigraphy and sedimentary facies of the Triassic in the western margin of the Zhugaer basin were studied, and favorable exploration targets were forecast. The major achievements are as follows. (1) The Triassic in the western margin of the Zhugaer basin can be divided into 1 second-order sequence and 5 third-order sequences, which are, in ascending order, TSQ1, TSQ2, TSQ3, TSQ4 and TSQ5. TSQ1 is equivalent to the Baikouquan formation, TSQ2 to the lower Kelamayi formation, TSQ3 to the upper Kelamayi formation, TSQ4 to the lower and middle Baijiantan formation, and TSQ5 to the upper Baijiantan formation. Each sequence is divided into transgressive and regressive systems tracts, and a sequence correlation framework is thereby established. (2) The factors controlling sequence development are analyzed; tectonics is believed to be the major controlling factor, and a model of sequence development is summarized. (3) Through the study of sedimentary facies, 6 facies types are recognized: alluvial fan, fan delta, braided river, braided delta, delta and lake; their microfacies are also recognized. This study proposes that the upper and lower Kelamayi formations (TSQ2 and TSQ3) were deposited by braided rivers rather than alluvial fans, a conclusion of important theoretical and practical significance. (4) A sedimentary facies map is compiled for each sequence, and the facies developed in each sequence are determined. In TSQ1 the facies are alluvial fan and fan delta; in TSQ2 mainly alluvial fan and fan delta in the north, and braided river and braided delta in the south; in TSQ3 mainly braided river and braided delta; in TSQ4 mainly braided delta in the north and meandering delta in the south; in TSQ5 mainly braided river and braided delta. (5) Within the sequence stratigraphic framework, favorable areas for concealed traps are forecast, and different types of traps develop in different systems tracts. (6) Favorable areas for future exploration are predicted.

Relevance: 100.00%

Abstract:

The global positioning system (GPS) not only provides a precise service for navigation and timing but can also be used to investigate ionospheric variations. From GPS observations we can obtain the total electron content (TEC), the so-called GPS TEC, which is used to characterize ionospheric structure. This thesis concerns GPS TEC data processing and ionospheric climatological analysis, as follows. First, we develop an algorithm for high-resolution global ionospheric TEC mapping. Building on current algorithms for global TEC mapping, we propose a practical way to calibrate the original GPS TEC against existing GIM results. We also carry out global/local TEC mapping by model fitting with the processed GPS TEC data; in practice, we apply it to local TEC mapping over southeastern China and obtain some initial results. Next, we propose a new method to calculate an equivalent ionospheric global electron content (GEC). We calculate this equivalent GEC with the TEC data along the geographic longitude 120°E. The climatological analysis shows that the GEC variation is mainly composed of three components: solar cycle, annual and semiannual variations. The solar cycle variation is dominant among them, indicating the most prominent influence; the annual and semiannual variations play a secondary role and are modulated by solar activity. We construct an empirical GEC model driven by solar activity and seasonal factors on the basis of partial correlation analysis. Generally speaking, our research shows not only that GPS, as an important observation, is advantageous for nowcasting ionospheric TEC, but also that GEC may become a new index describing the solar influence on the global ionosphere, since the strong correlation between GEC and the solar activity factor indicates a close relationship between the ionosphere and solar activity.
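The abstract does not give the functional form of the empirical GEC model, so the sketch below only illustrates the idea: GEC is fit as a linear combination of a solar-activity proxy plus annual and semiannual harmonics by ordinary least squares. The variable names and the choice of F10.7 as the solar driver are assumptions, not details taken from the thesis.

```python
import numpy as np

def fit_gec_model(t_days, f107, gec):
    """Fit GEC ~ a0 + a1*F10.7 + annual + semiannual harmonics (illustrative only)."""
    w = 2.0 * np.pi / 365.25                            # angular frequency of the annual cycle
    X = np.column_stack([
        np.ones_like(t_days),                           # constant offset
        f107,                                           # solar-activity proxy (assumed driver)
        np.cos(w * t_days), np.sin(w * t_days),         # annual variation
        np.cos(2 * w * t_days), np.sin(2 * w * t_days), # semiannual variation
    ])
    coeffs, *_ = np.linalg.lstsq(X, gec, rcond=None)
    return coeffs, X @ coeffs                           # coefficients and fitted GEC series

# Example with synthetic daily data (purely illustrative):
t = np.arange(0, 4 * 365)
f107 = 100 + 50 * np.sin(2 * np.pi * t / (11 * 365.25))          # crude solar-cycle proxy
gec = 0.5 * f107 + 10 * np.cos(2 * np.pi * t / 365.25) + np.random.normal(0, 2, t.size)
coeffs, fitted = fit_gec_model(t, f107, gec)
```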

Relevance: 100.00%

Abstract:

Based on the Dias model, this thesis studies the forward modeling of the IP response for MT and CSAMT sources using the author's own program, in order to provide a theoretical basis for extracting IP information from the far field, near field and transition field in extremely-low-frequency electromagnetic sounding with an artificial, strong, fixed-source signal. The outline of the thesis is as follows. First, the history of complex resistivity models is reviewed and the basis for choosing the Dias model is analyzed; the effects and responses of each parameter in the Dias model are also analyzed. Next, forward modeling of a 1D layered model with an MT source is studied and the IP effects of classic geoelectric models are numerically simulated; a clear anomaly is found, together with the relationship between the peak value of the amplitude-anomaly ratio and the phase-anomaly difference, with and without the IP parameters considered, within a frequency range. On the basis of the MT modeling, CSAMT modeling with a dipole source is studied, and the anomalous responses and their relationship to the characteristics of the target are obtained. Based on an infinite line source, the 2D IP effect of geoelectric targets is studied with 2D modeling, and the responses are calculated systematically for different source distances, target depths and wall-rock resistivities. Finally, the conclusions are drawn, the results are discussed, and the limitations of this research are pointed out.
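The Dias complex-resistivity model itself is not reproduced in the abstract, so the sketch below uses the better-known Cole-Cole dispersion as a stand-in, simply to illustrate what a frequency-domain IP response of a polarizable medium looks like; the parameter names (chargeability m, time constant tau, frequency dependence c) and the Cole-Cole form are assumptions, not the model used in the thesis.

```python
import numpy as np

def cole_cole_resistivity(freq_hz, rho0, m, tau, c):
    """Complex resistivity of a polarizable medium (Cole-Cole form, used here
    only as a stand-in for the Dias model described in the thesis)."""
    iw_tau = (1j * 2 * np.pi * freq_hz * tau) ** c
    return rho0 * (1.0 - m * (1.0 - 1.0 / (1.0 + iw_tau)))

# Amplitude-anomaly ratio and phase difference with vs. without IP,
# for a hypothetical half-space (illustrative parameters only).
freqs = np.logspace(-3, 3, 61)                            # Hz
rho_ip = cole_cole_resistivity(freqs, rho0=100.0, m=0.3, tau=1.0, c=0.5)
rho_no_ip = np.full_like(freqs, 100.0, dtype=complex)     # same medium, IP switched off

amplitude_ratio = np.abs(rho_ip) / np.abs(rho_no_ip)      # < 1 where polarization lowers |rho|
phase_difference = np.angle(rho_ip) - np.angle(rho_no_ip) # radians, negative when IP is present
```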

Relevance: 100.00%

Abstract:

This study aims to reveal macroscopic and microscopic reservoir anisotropy using new theories, methods and technology. In order to reveal the forming mechanism and distribution pattern of the remaining oil, a four-dimensional flow-unit model and a realistic model were established from more than 20 years of development data from the Pucheng Oilfield. Based on multidisciplinary theories, methods and technologies, and using the corresponding 4-D data volume and computers, combining quantitative and qualitative study, static and development data, and macroscopic and microscopic data, reservoirs of two different geneses, namely braided delta and lacustrine delta, were studied, and flow-unit models for both reservoir types were established. The main achievements of this thesis are summarized as follows. (1) Standards for parameter optimization, identification and evaluation of the two reservoir types were established; based on these standards, the reservoirs were classified into four flow-unit classes: G, E, F and P. (2) Static flow-unit models of the two reservoir types were established, and the relationship between geometric shape, spatial distribution and macroscopic remaining oil was revealed. (3) Microscopic flow-unit models were established, describing how the microscopic factors change during development. (4) Based on the BP (back-propagation) algorithm, an adaptive algorithm was designed, and the reservoir flow units were simulated with this new method. (5) A realistic flow-unit model of the reservoir was established; based on this model, microscopic development was simulated, revealing the oil and water seepage in the reservoir and the mechanism of microscopic remaining-oil formation. (6) The spatial distribution patterns of the remaining oil were summarized. The remaining oil is located mainly in places not swept by the injected water, in the high parts of structures, and near sealing faults. There are 3 kinds and 9 distribution modes of microscopic remaining oil, and their forming mechanisms and distribution rules are pointed out. The study has developed a set of theories, technologies and methods for flow-unit study, including flow-unit description, characterization and prediction, and it is also an advance in development geology theory for continental fault-depression lake basins.
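The abstract only names the BP (back-propagation) algorithm as the basis of the flow-unit simulation, so the following is a minimal, generic sketch of a back-propagation classifier assigning flow-unit classes (G, E, F, P) from petrophysical inputs; the choice of input features, network size and training details are all assumptions, not the adaptive algorithm actually designed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train_bp_classifier(X, y, hidden=8, classes=4, lr=0.1, epochs=2000):
    """Tiny one-hidden-layer back-propagation classifier (illustrative only)."""
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, classes)); b2 = np.zeros(classes)
    Y = np.eye(classes)[y]                          # one-hot targets
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                    # forward pass
        P = softmax(H @ W2 + b2)
        dZ2 = (P - Y) / n                           # backward pass (softmax cross-entropy)
        dW2 = H.T @ dZ2; db2 = dZ2.sum(axis=0)
        dH = dZ2 @ W2.T * (1 - H ** 2)              # tanh derivative
        dW1 = X.T @ dH; db1 = dH.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1              # gradient-descent update
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

# Hypothetical inputs: e.g. porosity, permeability, shale content -> class 0..3 (G, E, F, P)
X = rng.normal(size=(200, 3))
y = rng.integers(0, 4, size=200)
params = train_bp_classifier(X, y)
```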

Relevance: 100.00%

Abstract:

There is a debate in cognitive development theory about whether cognitive development is general or domain specific. More and more researchers think that cognitive development is domain specific, and have begun to investigate preschoolers' naive theories of humans' basic knowledge systems. Naive biology is one of the core domains, but it is disputed whether preschoolers hold separate naive biological concepts. This research examined preschoolers' cognitive development of a naive biological theory on two levels, "growth" and "aliveness", and also examined individual differences and the factors that lead to them. Three studies were designed. Study 1 examined preschoolers' cognition of growth, a basic trait of living things, and whether children can distinguish living from non-living things with this trait and understand its causality. Study 2 investigated preschoolers' distinction between living and non-living things at an integrated level. Study 3 investigated how children make inferences about unfamiliar things with their domain-specific knowledge. The results showed the following. 1. Preschoolers gradually developed a naive theory of biology at the growth level, but their naive theory at the integrated level had not yet developed. 2. Preschoolers' naive theory of biology is not "all or none": 4- and 5-year-old children distinguished living from non-living things to some extent, used non-intentional reasons to explain the cause of growth, and their explanations showed coherence. But growth had not yet become a criterion for the ontological distinction between living and non-living things for 4- and 5-year-olds, whereas most 6-year-old children could make the distinction; this shows the developing process of biological cognition. 3. Preschoolers' biological inference is influenced by their domain-specific knowledge: whether they can make inferences about a new trait of living things depends on whether they have the specific knowledge. In the deductive task, children used their knowledge to make inferences about unfamiliar things; 4-year-olds used concrete knowledge more often, while 6-year-olds used generalized knowledge more frequently. 4. Preschoolers' knowledge grows with age, but individuals develop at different speeds in different periods. Urban versus rural educational background affects cognitive performance, although the urban-rural difference in the knowledge used to distinguish living and non-living things decreases over time, and preschoolers are at the same developmental stage because the three age groups gave similar causal explanations in both quantity and quality. 5. There are intra-individual differences in preschoolers' naive biological cognition: they show different performance on different tasks and domains, and their cognitive development is sequential, in that they understand growth earlier than they understand "alive", which is an integrated concept. The intra-individual differences decrease with age.

Relevance: 100.00%

Abstract:

Srinivasan, A., King, R. D. and Bain, M.E. (2003) An Empirical Study of the Use of Relevance Information in Inductive Logic Programming. Journal of Machine Learning Research. 4(Jul):369-383

Relevance: 100.00%

Abstract:

NetSketch is a tool that enables the specification of network-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system so as to retain sufficient detail to enable future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis approach based on a strongly-typed, Domain-Specific Language (DSL) to specify network configurations at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we overview NetSketch, highlight its salient features, and illustrate how it could be used in applications, including the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications). In a companion paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity.

Relevance: 100.00%

Abstract:

NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system while retaining sufficient information about it to carry out future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis. The compositional analysis is based on a strongly-typed Domain-Specific Language (DSL) for describing and reasoning about constrained-flow networks at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity. In a companion paper [6], we overview NetSketch, highlight its salient features, and illustrate how it could be used in two applications: the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications).
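As a purely illustrative companion to the description above, the sketch below shows one way a small embedded DSL for constrained-flow components might look: components declare typed inflow/outflow intervals, and composition checks that the upstream output range fits within the downstream input range. This is an assumption-laden toy in Python, not the actual NetSketch DSL or its type system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    def contains(self, other: "Interval") -> bool:
        return self.lo <= other.lo and other.hi <= self.hi

@dataclass(frozen=True)
class Component:
    """A constrained-flow component typed by the flow rates it accepts and emits."""
    name: str
    accepts: Interval   # admissible input flow (e.g., packets/s)
    emits: Interval     # guaranteed output flow

def compose(up: Component, down: Component) -> Component:
    """Sequential composition: valid only if everything 'up' may emit, 'down' accepts."""
    if not down.accepts.contains(up.emits):
        raise TypeError(f"cannot compose {up.name} -> {down.name}: flow range mismatch")
    return Component(f"{up.name}>{down.name}", up.accepts, down.emits)

# Toy usage: a shaper feeding a link with a narrower admissible range.
shaper = Component("shaper", accepts=Interval(0, 1000), emits=Interval(0, 500))
link = Component("link", accepts=Interval(0, 600), emits=Interval(0, 600))
pipeline = compose(shaper, link)   # type-checks: [0, 500] fits inside [0, 600]
```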

Relevance: 100.00%

Abstract:

In research areas involving mathematical rigor, there are numerous benefits to adopting a formal representation of models and arguments: reusability, automatic evaluation of examples, and verification of consistency and correctness. However, accessibility has not been a priority in the design of formal verification tools that can provide these benefits. In earlier work [30] we attempted to address this broad problem by proposing several specific design criteria organized around the notion of a natural context: the sphere of awareness a working human user maintains of the relevant constructs, arguments, experiences, and background materials necessary to accomplish the task at hand. In this report we evaluate our proposed design criteria by utilizing, within the context of novel research, a formal reasoning system that is designed according to these criteria. In particular, we consider how the design and capabilities of the formal reasoning system that we employ influence, aid, or hinder our ability to accomplish a formal reasoning task – the assembly of a machine-verifiable proof pertaining to the NetSketch formalism. NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. It provides capabilities for compositional analysis based on a strongly-typed domain-specific language (DSL) for describing and reasoning about constrained-flow networks and invariants that need to be enforced thereupon. In a companion paper [13] we overview NetSketch, highlight its salient features, and illustrate how it could be used in actual applications. In this paper, we define, using a machine-readable syntax, major parts of the formal system underlying the operation of NetSketch, along with its semantics and a corresponding notion of validity. We then provide a proof of soundness for the formalism that can be partially verified using a lightweight formal reasoning system that simulates natural contexts. A traditional presentation of these definitions and arguments can be found in the full report on the NetSketch formalism [12].

Relevance: 100.00%

Abstract:

Nearest neighbor retrieval is the task of identifying, given a database of objects and a query object, the objects in the database that are the most similar to the query. Retrieving nearest neighbors is a necessary component of many practical applications, in fields as diverse as computer vision, pattern recognition, multimedia databases, bioinformatics, and computer networks. At the same time, finding nearest neighbors accurately and efficiently can be challenging, especially when the database contains a large number of objects, and when the underlying distance measure is computationally expensive. This thesis proposes new methods for improving the efficiency and accuracy of nearest neighbor retrieval and classification in spaces with computationally expensive distance measures. The proposed methods are domain-independent, and can be applied in arbitrary spaces, including non-Euclidean and non-metric spaces. In this thesis particular emphasis is given to computer vision applications related to object and shape recognition, where expensive non-Euclidean distance measures are often needed to achieve high accuracy. The first contribution of this thesis is the BoostMap algorithm for embedding arbitrary spaces into a vector space with a computationally efficient distance measure. Using this approach, an approximate set of nearest neighbors can be retrieved efficiently - often orders of magnitude faster than retrieval using the exact distance measure in the original space. The BoostMap algorithm has two key distinguishing features with respect to existing embedding methods. First, embedding construction explicitly maximizes the amount of nearest neighbor information preserved by the embedding. Second, embedding construction is treated as a machine learning problem, in contrast to existing methods that are based on geometric considerations. The second contribution is a method for constructing query-sensitive distance measures for the purposes of nearest neighbor retrieval and classification. In high-dimensional spaces, query-sensitive distance measures allow for automatic selection of the dimensions that are the most informative for each specific query object. It is shown theoretically and experimentally that query-sensitivity increases the modeling power of embeddings, allowing embeddings to capture a larger amount of the nearest neighbor structure of the original space. The third contribution is a method for speeding up nearest neighbor classification by combining multiple embedding-based nearest neighbor classifiers in a cascade. In a cascade, computationally efficient classifiers are used to quickly classify easy cases, and classifiers that are more computationally expensive and also more accurate are only applied to objects that are harder to classify. An interesting property of the proposed cascade method is that, under certain conditions, classification time actually decreases as the size of the database increases, a behavior that is in stark contrast to the behavior of typical nearest neighbor classification systems. The proposed methods are evaluated experimentally in several different applications: hand shape recognition, off-line character recognition, online character recognition, and efficient retrieval of time series. In all datasets, the proposed methods lead to significant improvements in accuracy and efficiency compared to existing state-of-the-art methods. 
In some datasets, the general-purpose methods introduced in this thesis even outperform domain-specific methods that have been custom-designed for such datasets.
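The BoostMap construction itself is not spelled out in the abstract, so the sketch below only illustrates the surrounding retrieval strategy it enables: embed all objects into a cheap vector space, use fast Euclidean distances to shortlist candidates, then re-rank the shortlist with the exact (expensive) distance. The embedding function, distance and parameter names here are placeholders, not the thesis's method.

```python
import numpy as np

def filter_and_refine(query, database, embed, exact_distance, shortlist_size=50, k=5):
    """Approximate k-NN retrieval: cheap embedding filter, expensive exact refine."""
    q_vec = embed(query)
    db_vecs = np.array([embed(obj) for obj in database])      # typically precomputed offline
    cheap = np.linalg.norm(db_vecs - q_vec, axis=1)           # fast L2 in the embedded space
    candidates = np.argsort(cheap)[:shortlist_size]           # filter step
    exact = [(i, exact_distance(query, database[i])) for i in candidates]
    exact.sort(key=lambda pair: pair[1])                      # refine with the exact distance
    return exact[:k]                                          # indices and distances of the k-NN

# Toy usage with a placeholder "expensive" distance and a trivial embedding:
database = [np.random.rand(20) for _ in range(1000)]
query = np.random.rand(20)
result = filter_and_refine(query, database,
                           embed=lambda x: x[:5],             # stand-in embedding function
                           exact_distance=lambda a, b: float(np.abs(a - b).sum()))
```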

Relevance: 100.00%

Abstract:

Grouping of collinear boundary contours is a fundamental process during visual perception. Illusory contour completion vividly illustrates how stable perceptual boundaries interpolate between pairs of contour inducers, but do not extrapolate from a single inducer. Neural models have simulated how perceptual grouping occurs in laminar visual cortical circuits. These models predicted the existence of grouping cells that obey a bipole property whereby grouping can occur inwardly between pairs or greater numbers of similarly oriented and co-axial inducers, but not outwardly from individual inducers. These models have not, however, incorporated spiking dynamics. Perceptual grouping is a challenge for spiking cells because its properties of collinear facilitation and analog sensitivity to inducer configurations occur despite irregularities in spike timing across all the interacting cells. Other models have demonstrated spiking dynamics in laminar neocortical circuits, but not how perceptual grouping occurs. The current model begins to unify these two modeling streams by implementing a laminar cortical network of spiking cells whose intracellular temporal dynamics interact with recurrent intercellular spiking interactions to quantitatively simulate data from neurophysiological experiments about perceptual grouping, the structure of non-classical visual receptive fields, and gamma oscillations.
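As a rough illustration of the bipole property described above (grouping only when support arrives from both flanks, never from a single inducer), the toy function below gates a grouping cell's output on the minimum of its two flank inputs; it abstracts away the spiking and laminar circuitry that the actual model implements, and all values and thresholds are assumptions.

```python
def bipole_activation(left_input: float, right_input: float, threshold: float = 0.2) -> float:
    """Fire only with sufficient input from BOTH flanks (toy bipole gate)."""
    gate = min(left_input, right_input)          # a single inducer cannot drive grouping
    return max(0.0, left_input + right_input - 2 * threshold) if gate > threshold else 0.0

# Inward interpolation between two inducers succeeds; outward extrapolation does not.
assert bipole_activation(0.8, 0.7) > 0.0   # two flanking inducers -> grouping
assert bipole_activation(0.8, 0.0) == 0.0  # single inducer -> no grouping
```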

Relevance: 100.00%

Abstract:

Motivated by accurate average-case analysis, MOdular Quantitative Analysis (MOQA) is developed at the Centre for Efficiency Oriented Languages (CEOL). In essence, MOQA allows the programmer to determine the average running time of a broad class of programmes directly from the code in a (semi-)automated way. The MOQA approach has the property of randomness preservation which means that applying any operation to a random structure, results in an output isomorphic to one or more random structures, which is key to systematic timing. Based on original MOQA research, we discuss the design and implementation of a new domain specific scripting language based on randomness preserving operations and random structures. It is designed to facilitate compositional timing by systematically tracking the distributions of inputs and outputs. The notion of a labelled partial order (LPO) is the basic data type in the language. The programmer uses built-in MOQA operations together with restricted control flow statements to design MOQA programs. This MOQA language is formally specified both syntactically and semantically in this thesis. A practical language interpreter implementation is provided and discussed. By analysing new algorithms and data restructuring operations, we demonstrate the wide applicability of the MOQA approach. Also we extend MOQA theory to a number of other domains besides average-case analysis. We show the strong connection between MOQA and parallel computing, reversible computing and data entropy analysis.
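The abstract names the labelled partial order (LPO) as MOQA's basic data type but does not define it, so the sketch below is only a guess at the minimal structure involved: a set of nodes carrying labels plus an order relation closed under transitivity, with a check that labels respect the order. None of the names here come from the MOQA language itself.

```python
from itertools import product

class LPO:
    """A labelled partial order: nodes with comparable labels and a partial order on nodes."""
    def __init__(self, labels, order_pairs):
        self.labels = dict(labels)                    # node -> label
        self.order = set(order_pairs)                 # (a, b) meaning a <= b
        self._transitive_closure()

    def _transitive_closure(self):
        changed = True
        while changed:
            changed = False
            for (a, b), (c, d) in product(list(self.order), repeat=2):
                if b == c and (a, d) not in self.order:
                    self.order.add((a, d))
                    changed = True

    def is_order_consistent(self):
        """Check labels are non-decreasing along the order (one plausible LPO invariant)."""
        return all(self.labels[a] <= self.labels[b] for a, b in self.order)

# Toy usage: three nodes forming a chain x <= y <= z with increasing labels.
lpo = LPO(labels={"x": 1, "y": 2, "z": 5}, order_pairs={("x", "y"), ("y", "z")})
assert ("x", "z") in lpo.order          # added by the transitive closure
assert lpo.is_order_consistent()
```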

Relevance: 100.00%

Abstract:

Obesity has been defined as a consequence of energy imbalance, where energy intake exceeds energy expenditure and results in a build-up of adipose tissue. However, this scientific definition masks the complicated social meanings associated with the condition. This research investigated the construction of meaning around obesity at various levels of inquiry to inform how obesity is portrayed and understood in Ireland. A multi-paradigmatic approach was adopted, drawing on theory and methods from psychology and sociology and an analytical framework combining the Common Sense Model and framing theory was employed. In order to examine the exo-level meanings of obesity, content analysis was performed on two media data sets (n=479, n=346) and a thematic analysis was also performed on the multiple newspaper sample (n=346). At the micro-level, obesity discourses were investigated via the thematic analysis of comments sampled from an online message board. Finally, an online survey assessed individual-level beliefs and understandings of obesity. The media analysis revealed that individual blame for obesity was pervasive and the behavioural frame was dominant. A significant increase in attention to obesity over time was observed, manifestations of weight stigma were common, and there was an emotive discourse of blame directed towards the parents of obese children. The micro-level analysis provided insight into the weight-based stigma in society and a clear set of negative ‘default’ judgements accompanied the obese label. The survey analysis confirmed that the behavioural frame was the dominant means of understanding obesity. One of the strengths of this thesis is the link created between framing and the Common Sense Model in the development of an analytical framework for application in the examination of health/illness representations. This approach helped to ascertain the extent of the pervasive biomedical and individual blame discourse on obesity, which establishes the basis for the stigmatisation of obese persons.

Relevance: 100.00%

Abstract:

In the last two decades, semiconductor nanocrystals have been the focus of intense research due to their size-dependent optical and electrical properties. Much is now known about how to control their size, shape, composition and surface chemistry, allowing fine control of their photophysical and electronic properties. However, genuine concerns have been raised regarding the heavy metal content of these materials, which is toxic even at relatively low concentrations and may limit their wide-scale use. These concerns have driven the development of heavy-metal-free alternatives. In recent years, germanium nanocrystals (Ge NCs) have emerged as environmentally friendlier alternatives to II-VI and IV-VI semiconductor materials as they are nontoxic, biocompatible and electrochemically stable. This thesis reports the synthesis and characterisation of Ge NCs and their application as fluorescent probes for the detection of metal ions. A room-temperature method for the synthesis of size-monodisperse Ge NCs within inverse micelles is reported, with well-defined core diameters that may be tuned from 3.5 to 4.5 nm. The Ge NCs are chemically passivated with amine ligands, minimising surface oxidation while rendering the NCs dispersible in a range of polar solvents. Regulation of the Ge NC size is achieved by variation of the ammonium salts used to form the micelles. A maximum quantum yield of 20% is shown for the nanocrystals, and a transition from primarily blue to green emission is observed as the NC diameter increases from 3.5 to 4.5 nm. A polydisperse sample with a mixed emission profile is prepared and separated by centrifugation into individually sized NCs, which each showed blue and green emission only, with total suppression of other emission colours. A new, efficient one-step synthesis of Ge NCs with in situ passivation and straightforward purification steps is also reported. Ge NCs are formed by co-reduction of a mixture of GeCl4 and n-butyltrichlorogermane; the latter is used both as a capping ligand and a germanium source. The surface-bound layer of butyl chains both chemically passivates and stabilises the Ge NCs. Optical spectroscopy confirmed that these NCs are in the strong quantum confinement regime, with significant involvement of surface species in exciton recombination processes. The PL QY is determined to be 37%, one of the highest values reported for organically terminated Ge NCs. A synthetic method is developed to produce size-monodisperse Ge NCs with modified surface chemistries bearing carboxylic acid, acetate, amine and epoxy functional groups. The effect of these different surface terminations on the optical properties of the NCs is also studied. Comparison of the emission properties of these Ge NCs showed that the wavelength position of the PL maxima could be moved from the UV to the blue/green by choice of the appropriate surface group. We also report the application of water-soluble Ge NCs as a fluorescent sensing platform for the fast, highly selective and sensitive detection of Fe3+ ions. The luminescence quenching mechanism is confirmed by lifetime and absorbance spectroscopies, while the applicability of this assay for detection of Fe3+ in real water samples is investigated and found to satisfy the US Environmental Protection Agency requirements for Fe3+ levels in drinking water supplies.

Relevance: 100.00%

Abstract:

Mechanical factors play a crucial role in the development of articular cartilage in vivo. In this regard, tissue engineers have sought to leverage native mechanotransduction pathways to enhance in vitro stem cell-based cartilage repair strategies. However, a thorough understanding of how individual mechanical factors influence stem cell fate is needed to predictably and effectively utilize this strategy of mechanically-induced chondrogenesis. This article summarizes some of the latest findings on mechanically stimulated chondrogenesis, highlighting several new areas of interest, such as the effects of mechanical stimulation on matrix maintenance and terminal differentiation, as well as the use of multifactorial bioreactors. Additionally, the roles of individual biophysical factors, such as hydrostatic or osmotic pressure, are examined in light of their potential to induce mesenchymal stem cell chondrogenesis. An improved understanding of biomechanically-driven tissue development and maturation of stem cell-based cartilage replacements will hopefully lead to the development of cell-based therapies for cartilage degeneration and disease.