739 results for Integrated modular avionics


Relevance: 20.00%

Publisher:

Abstract:

The Prism family of algorithms induces modular classification rules, in contrast to the Top Down Induction of Decision Trees (TDIDT) approach, which induces classification rules in the intermediate form of a tree structure. Both approaches achieve comparable classification accuracy; however, in some cases Prism outperforms TDIDT. For both approaches, pre-pruning facilities have been developed to prevent the induced classifiers from overfitting on noisy datasets, by cutting rule terms or whole rules, or by truncating decision trees according to certain metrics. Many pre-pruning mechanisms have been developed for the TDIDT approach, but for the Prism family the only existing pre-pruning facility is J-pruning. J-pruning works not only on Prism algorithms but also on TDIDT. Although J-pruning has been shown to produce good results, this work points out that it does not use its full potential. The original J-pruning facility is examined, and a new pre-pruning facility, called Jmax-pruning, is proposed and evaluated empirically. A possible pre-pruning facility for TDIDT based on Jmax-pruning is also discussed.
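Both J-pruning and Jmax-pruning are built around the J-measure of Smyth and Goodman, which scores the information content of a rule. The following is a minimal sketch (not the authors' code) of how the J-measure of a rule "IF y THEN x" can be computed; the probabilities passed in are assumed to be estimated from the training data:

```python
import math

def j_measure(p_y, p_x, p_x_given_y):
    """Smyth-Goodman J-measure of a rule 'IF y THEN x', in bits.

    p_y         -- probability that the rule's antecedent fires, P(y)
    p_x         -- prior probability of the target class, P(x)
    p_x_given_y -- accuracy of the rule, P(x | y)
    """
    def term(p, q):
        # p * log2(p/q), with the 0 * log(0) = 0 convention
        return 0.0 if p == 0 else p * math.log2(p / q)

    # cross-entropy between the posterior and prior class distributions
    j = term(p_x_given_y, p_x) + term(1 - p_x_given_y, 1 - p_x)
    return p_y * j

# A rule firing on 30% of instances that lifts the class probability
# from a 50% prior to 90% carries measurable information content:
print(round(j_measure(0.3, 0.5, 0.9), 4))
```

Pruning facilities of this family stop specializing a rule when further terms no longer increase (J-pruning) or can no longer reach (Jmax-pruning) the achievable J value.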

Relevance: 20.00%

Publisher:

Abstract:

Advances in hardware and software in the past decade have made it possible to capture, record and process fast data streams at a large scale. The research area of data stream mining has emerged as a consequence of these advances, in order to cope with the real-time analysis of potentially large and changing data streams. Examples of data streams include Google searches, credit card transactions, telemetric data and data from continuous chemical production processes. In some cases the data can be processed in batches by traditional data mining approaches. However, some applications require the data to be analysed in real time as soon as it is captured, for example when the data stream is infinite, fast changing, or simply too large to be stored. One of the most important data mining techniques on data streams is classification, which involves training the classifier on the data stream in real time and adapting it to concept drifts. Most data stream classifiers are based on decision trees. However, it is well known in the data mining community that there is no single optimal algorithm: an algorithm may work well on one or several datasets but badly on others. This paper introduces eRules, a new rule-based adaptive classifier for data streams, based on an evolving set of rules. eRules induces a set of rules that is constantly evaluated and adapted to changes in the data stream by adding new rules and removing old ones. It differs from the more popular decision-tree-based classifiers in that it tends to leave data instances unclassified rather than forcing a classification that could be wrong. The ongoing development of eRules aims to improve its accuracy further through dynamic parameter setting, which will also address the problem of changing feature domain values.
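The abstention behaviour described above can be pictured with a minimal sketch (not the authors' implementation; all class names and rules are hypothetical): a rule-based classifier that adapts by adding and removing rules, and leaves uncovered instances unclassified instead of guessing:

```python
class Rule:
    def __init__(self, conditions, label):
        self.conditions = conditions  # dict: feature -> required value
        self.label = label

    def covers(self, instance):
        return all(instance.get(f) == v for f, v in self.conditions.items())

class AbstainingRuleClassifier:
    def __init__(self):
        self.rules = []

    def add_rule(self, rule):
        self.rules.append(rule)

    def remove_stale(self, window):
        # drop rules that no longer cover anything in the recent window,
        # a crude stand-in for adapting to concept drift
        self.rules = [r for r in self.rules
                      if any(r.covers(x) for x, _ in window)]

    def predict(self, instance):
        for rule in self.rules:
            if rule.covers(instance):
                return rule.label
        return None  # leave unclassified rather than force a label

clf = AbstainingRuleClassifier()
clf.add_rule(Rule({"weather": "sunny"}, "play"))
print(clf.predict({"weather": "sunny"}))   # -> play
print(clf.predict({"weather": "stormy"}))  # -> None (abstains)
```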

Relevance: 20.00%

Publisher:

Abstract:

Farming freshwater prawns with fish in rice fields is widespread in coastal regions of southwest Bangladesh because of favourable resources and ecological conditions. This article provides an overview of an ecosystem-based approach to integrated prawn-fish-rice farming in southwest Bangladesh. The practice of prawn and fish farming in rice fields is a form of integrated aquaculture-agriculture, which provides a wide range of social, economic and environmental benefits. Integrated prawn-fish-rice farming plays an important role in the economy of Bangladesh, earning foreign exchange and increasing food production. However, this unique farming system in coastal Bangladesh is particularly vulnerable to climate change. We suggest that community-based adaptation strategies must be developed to cope with the challenges. We propose that integrated prawn-fish-rice farming could be relocated from the coastal region to less vulnerable upland areas, but caution that this will require appropriate adaptation strategies and an enabling institutional environment.

Relevance: 20.00%

Publisher:

Abstract:

Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities, and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately, fulfil all user needs or cover all concepts of data quality. In this study, we discuss some philosophical issues concerning data quality. We identify actual user needs regarding data quality, review existing standards and specifications for data quality, and propose an integrated model for data quality in the field of Earth observation (EO). We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is rooted in the EO domain, we believe that its ideas and methodologies can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.
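The metadata-inheritance mechanism can be illustrated with a small sketch (hypothetical, not the paper's actual model): a dataset-level record falls back to its parent collection's quality metadata for any field it does not override, so one quality report can serve many datasets:

```python
class MetadataRecord:
    """A metadata record that inherits fields from a parent record."""

    def __init__(self, fields, parent=None):
        self.fields = fields   # dict of this record's own metadata
        self.parent = parent   # optional parent record to inherit from

    def get(self, key):
        if key in self.fields:
            return self.fields[key]          # own value wins
        if self.parent is not None:
            return self.parent.get(key)      # else inherit upward
        return None

# Collection-level quality metadata, inherited by a member dataset.
# Field names and values here are invented for illustration.
collection = MetadataRecord({"sensor": "MODIS", "uncertainty": "±0.5 K"})
dataset = MetadataRecord({"region": "Europe"}, parent=collection)

print(dataset.get("region"))       # own field
print(dataset.get("uncertainty"))  # inherited from the collection
```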

Relevance: 20.00%

Publisher:

Abstract:

We describe a one-port de-embedding technique suitable for the quasi-optical characterization of terahertz integrated components at frequencies beyond the operational range of most vector network analyzers. This technique is also suitable when precision terminations cannot be manufactured to sufficiently fine tolerances for the application of a TRL de-embedding technique. The technique is based on vector reflection measurements of a series of easily realizable test pieces. A theoretical analysis is presented for the precision of the technique when implemented using a quasi-optical null-balanced bridge reflectometer. The analysis takes into account quantization effects in the linear and angular encoders associated with the balancing procedure, as well as source power and detector noise equivalent power. The precision in measuring waveguide characteristic impedance and attenuation using this de-embedding technique is further analyzed after taking into account changes in the coupled power due to axial, rotational, and lateral alignment errors between the device under test and the instrument's test port. The analysis is based on the propagation of errors after assuming imperfect coupling of two fundamental Gaussian beams. The required precision in repositioning the samples at the instrument's test port is discussed. Quasi-optical measurements using the de-embedding process for a WR-8 adjustable precision short at 125 GHz are presented. The de-embedding methodology may be extended to allow the determination of S-parameters of arbitrary two-port junctions. The proposed measurement technique should prove most useful above 325 GHz, where there is a lack of measurement standards.
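For context on why lateral and angular alignment errors enter the error budget, the standard power-coupling results for two identical fundamental Gaussian beams of waist \(w_0\) and wavelength \(\lambda\) (a textbook approximation, not the paper's full analysis) give, for a lateral offset \(d\) and an angular tilt \(\theta\),

```latex
\eta_{\text{lat}} = \exp\!\left(-\frac{d^{2}}{w_{0}^{2}}\right),
\qquad
\eta_{\text{tilt}} = \exp\!\left[-\left(\frac{\pi w_{0}\,\theta}{\lambda}\right)^{2}\right],
```

so small misalignments degrade the coupled power quadratically; the paper's propagation-of-errors analysis builds on coupling terms of this kind.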

Relevance: 20.00%

Publisher:

Abstract:

We develop a new sparse kernel density estimator using a forward constrained regression framework, within which the nonnegativity and summing-to-unity constraints on the mixing weights can easily be satisfied. Our main contribution is a recursive algorithm that selects significant kernels one at a time, using the minimum integrated square error (MISE) criterion for both the selection of kernels and the estimation of mixing weights. The proposed approach is simple to implement and its computational cost is very low. Specifically, the complexity of our algorithm is of the order of the number of training data points N, much lower than the order-N² cost of the best existing sparse kernel density estimators. Numerical examples demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy comparable to that of the classical Parzen window estimate and other existing sparse kernel density estimators.
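To make the contrast concrete, here is a minimal sketch (illustrative only; the MISE-based kernel selection itself is omitted) of a classical Parzen window estimate, which weights every training point equally, against a sparse estimate built from a few kernel centres with nonnegative weights summing to unity, the constraint the authors enforce:

```python
import math

def parzen_density(x, samples, h):
    """Classical Parzen window estimate with a Gaussian kernel:
    every training point carries equal weight 1/N."""
    n = len(samples)
    norm = 1.0 / (n * h * math.sqrt(2 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)

def sparse_density(x, centres, weights, h):
    """Sparse estimate: a few selected kernel centres with nonnegative
    weights that sum to one."""
    norm = 1.0 / (h * math.sqrt(2 * math.pi))
    return norm * sum(w * math.exp(-0.5 * ((x - c) / h) ** 2)
                      for c, w in zip(centres, weights))

# Six points clustered around 0 and 2; two centres stand in for them,
# with weights (0.5, 0.5) chosen by hand for this toy example.
samples = [0.0, 0.1, -0.1, 2.0, 2.05, 1.95]
full = parzen_density(1.0, samples, h=0.5)
sparse = sparse_density(1.0, [0.0, 2.0], [0.5, 0.5], h=0.5)
print(round(full, 4), round(sparse, 4))  # nearly identical densities
```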

Relevance: 20.00%

Publisher:

Abstract:

Details are given of the development and application of a 2D depth-integrated, conformal boundary-fitted, curvilinear model for predicting the depth-mean velocity field and the spatial concentration distribution in estuarine and coastal waters. A numerical method for conformal mesh generation, based on a boundary integral equation formulation, has been developed. With this method, a general polygonal region with curved edges can be mapped onto a regular polygonal region with the same number of horizontal and vertical straight edges, and a multiply connected region can be mapped onto a regular region with the same connectivity. A stretching transformation on the conformally generated mesh has also been used to provide greater detail where it is needed close to the coast, with larger mesh sizes further offshore, thereby minimizing the computing effort whilst maximizing accuracy. The curvilinear hydrodynamic and solute model has been developed from a robust rectilinear model. The hydrodynamic equations are approximated using the ADI finite difference scheme with a staggered grid, and the solute transport equation is approximated using a modified QUICK scheme. Three numerical examples have been chosen to test the curvilinear model, with an emphasis placed on complex practical applications.
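The QUICK scheme mentioned for the solute transport equation can be illustrated in one dimension (a sketch only; the paper's model is 2D, on a curvilinear mesh, with a modified scheme). For flow in the positive direction, the face value is an upstream-weighted quadratic through three neighbouring nodes:

```python
def quick_face(phi, i):
    """QUICK value at face i+1/2 for positive flow: an upstream-weighted
    quadratic through phi[i-1], phi[i], phi[i+1]."""
    return (6 * phi[i] + 3 * phi[i + 1] - phi[i - 1]) / 8.0

def advect_step(phi, c):
    """One explicit step of d(phi)/dt + u d(phi)/dx = 0 on interior
    nodes, with Courant number c = u * dt / dx."""
    new = phi[:]
    for i in range(2, len(phi) - 1):
        new[i] = phi[i] - c * (quick_face(phi, i) - quick_face(phi, i - 1))
    return new

phi = [0.0] * 10
phi[4] = 1.0                  # a concentration spike
phi = advect_step(phi, 0.2)   # spike starts drifting downstream
print([round(v, 3) for v in phi])
```

The flux-difference form conserves the total solute on the interior nodes, which is why such schemes are favoured for transport modelling.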

Relevance: 20.00%

Publisher:

Abstract:

A clinical pathway is an approach to standardising care processes to support the implementation of clinical guidelines and protocols. It is designed to support the management of treatment processes, including clinical and non-clinical activities, resources and financial aspects. It provides detailed guidance for each stage in the management of a patient, with the aim of improving the continuity and coordination of care across different disciplines and sectors. In practice, however, the lack of knowledge sharing and the poor information accuracy of paper-based clinical pathways burden health-care staff with a large amount of paperwork, often resulting in medical errors, inefficient treatment processes and thus poor-quality medical services. This paper first presents a theoretical underpinning and a co-design research methodology for integrated pathway management, drawing input from organisational semiotics. An approach to integrated clinical pathway management is then proposed, which aims to embed pathway knowledge into treatment processes and existing hospital information systems. The capability of this approach has been demonstrated through a case study in one of the largest hospitals in China. The outcome reveals that medical quality can be improved significantly through classified clinical pathway knowledge and seamless integration with hospital information systems.
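One way to picture "embedding pathway knowledge into treatment processes" is a machine-readable pathway that a hospital information system checks records against, flagging variances instead of relying on paper forms. The sketch below is hypothetical; the stage names and activities are invented for illustration:

```python
# An ordered pathway: each stage lists the activities the guideline
# expects before the patient moves on. (Invented example data.)
PATHWAY = [
    ("admission", {"record vitals", "assign bed"}),
    ("diagnosis", {"order blood test", "imaging"}),
    ("treatment", {"administer medication"}),
    ("discharge", {"discharge summary"}),
]

def check_compliance(completed_by_stage):
    """Return, per stage, the guideline activities missing from the
    electronic record -- the variances a pathway system would flag."""
    variances = {}
    for stage, required in PATHWAY:
        done = completed_by_stage.get(stage, set())
        missing = required - done
        if missing:
            variances[stage] = missing
    return variances

record = {"admission": {"record vitals", "assign bed"},
          "diagnosis": {"order blood test"}}
print(check_compliance(record))  # flags imaging and later stages
```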

Relevance: 20.00%

Publisher:

Abstract:

The old paradigm that Amazonia's tropical ecosystems prevented cultural development beyond small-scale shifting agricultural economies with little environmental impact no longer holds true for much of Amazonia. A diversity of archaeological evidence, including terra preta soils, raised fields, causeways, large habitation mounds, geometric earthworks, and megalithic monuments, points to considerable cultural complexity and environmental impacts. However, uncertainty remains over the chronology of these cultures, their diet and economy, and the scale of environmental impact and land use associated with them. Here, we argue that a cross-disciplinary approach, closely coupling palaeoecology and archaeology, can potentially resolve these uncertainties. We show how, with careful site selection (pairing small and large lakes, close proximity to archaeological sites, transects of soil pits) and choice of techniques (e.g., pollen, phytoliths, starch grains, charcoal, stable isotopes), these two disciplines can be successfully integrated to provide a powerful tool for investigating the relationship between pre-Columbian cultures and their environment.

Relevance: 20.00%

Publisher:

Abstract:

This study investigates the child (L1) acquisition of properties at the interfaces of morpho-syntax, syntax-semantics and syntax-pragmatics, focusing on inflected infinitives in European Portuguese (EP). Three child groups were tested, 6–7-year-olds, 9–10-year-olds and 11–12-year-olds, as well as an adult control group. The data demonstrate that children as young as 6 have knowledge of the morpho-syntactic properties of inflected infinitives, although at first glance they seem to show partially insufficient knowledge of the syntax–semantics interface properties (i.e. non-obligatory control properties), unlike children aged 9 and older, who show clearer evidence of knowledge of both types of properties. In general, however, both morpho-syntactic and syntax–semantics interface properties are also accessible to 6–7-year-old children, although these children prefer a range of interpretations partially different from the adults'; in certain cases, they may not draw on certain pragmatic inferences that make additional interpretations available to adults and older children. Crucially, our data demonstrate that EP children master the two types of properties of inflected infinitives years before Brazilian Portuguese children do (Pires and Rothman, 2009a, 2009b), the reasons for and implications of which we discuss in detail.

Relevance: 20.00%

Publisher:

Abstract:

To bridge the gaps between traditional mesoscale modelling and microscale modelling, the National Center for Atmospheric Research, in collaboration with other agencies and research groups, has developed an integrated urban modelling system coupled to the weather research and forecasting (WRF) model as a community tool to address urban environmental issues. The core of this WRF/urban modelling system consists of the following: (1) three methods, with different degrees of freedom, to parameterize urban surface processes, ranging from a simple bulk parameterization to a sophisticated multi-layer urban canopy model with an indoor–outdoor exchange sub-model that directly interacts with the atmospheric boundary layer; (2) coupling to fine-scale computational fluid dynamics models (Reynolds-averaged Navier–Stokes and large-eddy simulation) for transport and dispersion (T&D) applications; (3) procedures to incorporate high-resolution urban land use, building morphology, and anthropogenic heating data using the National Urban Database and Access Portal Tool (NUDAPT); and (4) an urbanized high-resolution land data assimilation system. This paper provides an overview of this modelling system; addresses the daunting challenges of initializing the coupled WRF/urban model and of specifying the potentially vast number of parameters required to execute it; explores the model's sensitivity to these urban parameters; and evaluates the ability of WRF/urban to capture urban heat islands, complex boundary-layer structures aloft, and urban plume T&D for several major metropolitan regions. Recent applications of this modelling system illustrate its promising utility, as a regional climate-modelling tool, for investigating the impacts of future urbanization on regional meteorological conditions and on air quality under future climate change scenarios. Copyright © 2010 Royal Meteorological Society
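As a toy illustration of the simplest of the three methods, a bulk parameterization represents the city inside a land-surface scheme by altering a few surface parameters and adding an anthropogenic heat term, rather than resolving buildings. The numbers below are illustrative placeholders, not WRF's actual values:

```python
# Illustrative-only surface parameters for a rural grid cell.
RURAL = {"albedo": 0.20, "roughness_m": 0.1}

def urbanize(params, anthropogenic_heat_wm2=20.0):
    """Bulk-style urban modification: darker, rougher surface plus an
    anthropogenic heat release (all values hypothetical)."""
    urban = dict(params)
    urban["albedo"] = 0.15              # built surfaces absorb more
    urban["roughness_m"] = 0.8          # buildings increase drag
    urban["qf_wm2"] = anthropogenic_heat_wm2  # heating, traffic, etc.
    return urban

def absorbed_energy(swdown_wm2, params):
    # absorbed shortwave plus any anthropogenic release
    # (longwave and turbulent fluxes ignored in this sketch)
    return swdown_wm2 * (1 - params["albedo"]) + params.get("qf_wm2", 0.0)

print(absorbed_energy(500.0, RURAL))
print(absorbed_energy(500.0, urbanize(RURAL)))
```

Even this crude contrast shows why urban cells warm relative to rural ones, which is the effect the more sophisticated canopy models refine.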

Relevance: 20.00%

Publisher:

Abstract:

This article reports the results of an experiment that examined how demand aggregators can discipline vertically integrated firms (generator and distributor-retailer holdings) that hold a high share of a wholesale electricity market operating as a uniform price double auction (UPDA). We initially develop a treatment in which holding members redistribute, in equal proportions (50%-50%), the profit derived from imposing supra-competitive prices. Subsequently, we introduce a vertical disintegration (unbundling) treatment with information sharing between former holding members, where profits are distributed according to market outcomes. Finally, a third treatment introduces two active demand aggregators with flexible interruptible loads in real time. We found that the introduction of responsive demand aggregators neutralizes market power and increases market efficiency, even beyond what is achieved through vertical disintegration.
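For readers unfamiliar with the market institution, a uniform price double auction clears all traded units at a single price. The following minimal sketch (illustrative only; the experiment's actual clearing rules may differ, e.g. in how the uniform price is chosen) matches sorted bids against sorted asks and prices every trade at the midpoint of the marginal matched pair:

```python
def clear_upda(bids, asks):
    """Clear a uniform price double auction.

    bids, asks -- lists of (price, quantity) offers to buy / sell.
    Returns (clearing_price, traded_volume); (None, 0) if no trade.
    """
    bids = sorted(bids, reverse=True)   # highest willingness-to-pay first
    asks = sorted(asks)                 # cheapest supply first
    # expand to unit offers for simplicity
    b = [p for p, q in bids for _ in range(q)]
    a = [p for p, q in asks for _ in range(q)]
    volume = 0
    while volume < min(len(b), len(a)) and b[volume] >= a[volume]:
        volume += 1
    if volume == 0:
        return None, 0
    # one price for all trades: midpoint of the marginal bid and ask
    price = (b[volume - 1] + a[volume - 1]) / 2.0
    return price, volume

price, volume = clear_upda(
    bids=[(50, 2), (40, 1), (20, 1)],
    asks=[(10, 1), (30, 2), (60, 1)],
)
print(price, volume)  # three units trade at a single uniform price
```

Because every unit clears at one price, a holding with a large generation share can profitably withhold capacity to raise that price, which is the market-power problem the demand aggregators in the experiment counteract.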