939 results for Unified Lending


Relevance: 20.00%

Abstract:

Automatic generation of classification rules has been an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision-making systems. However, a principal problem that arises with most methods for the generation of classification rules is the overfitting of training data. When Big Data is dealt with, this may result in the generation of a large number of complex rules, which may not only increase computational cost but also lower the accuracy in predicting further unseen instances. This has led to the necessity of developing pruning methods for the simplification of rules. In addition, once generated, classification rules are used further to make predictions. Where efficiency is concerned, it is desirable to find the first rule that fires as quickly as possible when searching through a rule set; thus a suitable structure is required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for the construction of rule-based classification systems consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors also review some existing methods and techniques used for each of the three operations and highlight their limitations. They then introduce some novel methods and techniques that they have recently developed, and discuss these in comparison with existing ones with respect to the efficient processing of Big Data.
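
To make the prediction step concrete, here is a minimal, hypothetical Python sketch of searching a rule set for the first rule that fires; the Rule type, the attribute names and the ordered-list representation are illustrative assumptions, not the authors' framework:

```python
from dataclasses import dataclass
from typing import Optional

# A rule fires when all of its conditions hold for a given instance.
@dataclass
class Rule:
    conditions: list   # (attribute, value) pairs that must all match
    label: str         # class predicted when the rule fires

def first_firing_rule(rules: list, instance: dict) -> Optional[str]:
    """Linear scan: return the label of the first rule that fires."""
    for rule in rules:
        if all(instance.get(attr) == val for attr, val in rule.conditions):
            return rule.label
    return None  # no rule fired; a default class could be returned instead

rules = [
    Rule([("outlook", "sunny"), ("humidity", "high")], "no"),
    Rule([("outlook", "overcast")], "yes"),
]
print(first_firing_rule(rules, {"outlook": "overcast", "humidity": "low"}))  # -> yes
```

With an ordered list the prediction cost grows linearly with the number of rules; a more suitable rule-set representation, of the kind the chapter discusses, aims to reach the first firing rule faster.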

Relevance: 20.00%

Abstract:

We describe some recent advances in the numerical solution of acoustic scattering problems. A major focus of the paper is the efficient solution of high-frequency scattering problems via hybrid numerical-asymptotic boundary element methods. We also make connections to the unified transform method due to A. S. Fokas and co-authors, analysing particular instances of this method, proposed by J. A. DeSanto and co-authors, for problems of acoustic scattering by diffraction gratings.
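
For context, the hybrid numerical-asymptotic approach mentioned above is usually built on an ansatz of the following standard form (notation assumed from the general HNA literature, not taken from this paper):

```latex
% The unknown boundary density v is written as a sum of known,
% explicitly prescribed oscillatory phases \psi_m multiplied by
% slowly varying amplitudes V_m, so that only the V_m need to be
% approximated numerically (e.g. by piecewise polynomials):
v(x) \;\approx\; \sum_{m=1}^{M} V_m(x)\, e^{\mathrm{i} k \psi_m(x)}, \qquad x \in \Gamma,
% where k is the wavenumber and \Gamma is the boundary of the scatterer.
```

Because the oscillation is captured analytically by the phases, the number of degrees of freedom needed for a fixed accuracy can grow only modestly as k increases, which is the source of the efficiency at high frequencies.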

Relevance: 20.00%

Abstract:

We present a method for the recognition of complex actions. Our method combines automatic learning of simple actions and manual definition of complex actions in a single grammar. Contrary to the general trend in complex action recognition, which divides recognition into two stages, our method performs recognition of simple and complex actions in a unified way. This is achieved by encoding simple-action HMMs within the stochastic grammar that models complex actions. This unified approach enables the higher activity layers to influence the recognition of simple actions more effectively, which leads to a substantial improvement in the classification of complex actions. We consider the recognition of complex actions based on person transits between areas in the scene. As input, our method receives crossings of tracks across a set of zones, which are derived using unsupervised learning of the movement patterns of the objects in the scene. We evaluate our method on a large dataset showing normal, suspicious and threat behaviour in a parking lot. Experiments show an improvement of ~30% in the recognition of both high-level scenarios and their composing simple actions with respect to a two-stage approach. Experiments with synthetic noise simulating the most common tracking failures show that our method experiences only a limited decrease in performance when moderate amounts of noise are added.
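
The unified scoring idea, combining grammar production probabilities with the likelihoods of the simple-action HMMs, can be illustrated with a toy Python sketch; all action names, probabilities and log-likelihood values below are invented, and the real system parses observation sequences rather than scoring fixed expansions:

```python
import math

# Hypothetical log-likelihoods from simple-action HMMs, given a track's
# zone-crossing observations (stand-ins for real HMM forward scores).
hmm_loglik = {"enter_lot": -2.1, "walk_to_car": -3.4, "loiter": -1.8}

# A toy stochastic grammar: each complex action expands into a sequence
# of simple actions with a production log-probability.
grammar = {
    "park_car":   (math.log(0.7), ["enter_lot", "walk_to_car"]),
    "suspicious": (math.log(0.3), ["enter_lot", "loiter"]),
}

def score(complex_action: str) -> float:
    log_p, parts = grammar[complex_action]
    # Unified scoring: the grammar production probability is combined
    # with the HMM evidence for each composing simple action.
    return log_p + sum(hmm_loglik[p] for p in parts)

best = max(grammar, key=score)
print(best, score(best))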

Relevance: 20.00%

Abstract:

A new coupled cloud physics–radiation parameterization of the bulk optical properties of ice clouds is presented. The parameterization is consistent with assumptions in the cloud physics scheme regarding particle size distributions (PSDs) and mass–dimensional relationships. The parameterization is based on a weighted ice crystal habit mixture model, and its bulk optical properties are parameterized as simple functions of wavelength and ice water content (IWC). This approach directly couples IWC to the bulk optical properties, negating the need for diagnosed variables such as the ice crystal effective dimension. The parameterization is implemented in the Met Office Unified Model Global Atmosphere 5.0 (GA5) configuration, which is used to simulate the 20-yr annual-mean shortwave (SW) and longwave (LW) fluxes at the top of the atmosphere (TOA), as well as the temperature structure of the atmosphere, under various microphysical assumptions. The coupled parameterization is directly compared against the current operational radiation parameterization while maintaining the same cloud physics assumptions. In this experiment, the impacts of the two parameterizations on the SW and LW radiative effects at TOA are also investigated and compared against observations. The 20-yr simulations are compared against the latest observations of atmospheric temperature and radiative fluxes at TOA. The comparisons demonstrate that the choice of PSD and the assumed ice crystal shape distribution are equally important. Moreover, the consistent radiation parameterization removes a long-standing cold temperature bias in the tropical troposphere but slightly warms the southern midlatitudes by about 0.5 K.
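
As a loose illustration, assuming a hypothetical power-law form, of what parameterizing a bulk optical property directly as a function of IWC might look like (the actual scheme's functional forms, spectral bands and coefficients are not reproduced here):

```python
import numpy as np

# The (a, b) coefficients per band below are invented for illustration only.
COEFFS = {"sw_band": (0.06, 0.90), "lw_band": (0.04, 0.85)}

def extinction_coefficient(iwc: np.ndarray, band: str) -> np.ndarray:
    """Bulk extinction as a function of ice water content (g m^-3) alone,
    with no diagnosed effective dimension required."""
    a, b = COEFFS[band]
    return a * iwc ** b

print(extinction_coefficient(np.array([0.01, 0.1, 1.0]), "sw_band"))
```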

Relevance: 20.00%

Abstract:

We describe Global Atmosphere 3.0 (GA3.0): a configuration of the Met Office Unified Model (MetUM) developed for use across climate research and weather prediction activities. GA3.0 has been formulated by converging the development paths of the Met Office's weather and climate global atmospheric model components such that, wherever possible, atmospheric processes are modelled or parametrized seamlessly across spatial resolutions and timescales. This unified development process will provide the Met Office and its collaborators with regular releases of a configuration that has been evaluated, and can hence be applied, over a variety of modelling regimes. We also describe Global Land 3.0 (GL3.0): a configuration of the JULES community land surface model developed for use with GA3.0.

Relevance: 20.00%

Abstract:

Weather and climate model simulations of the West African Monsoon (WAM) generally represent the rainfall distribution and monsoon circulation poorly because key processes, such as clouds and convection, are poorly characterized. The vertical distribution of cloud and precipitation during the WAM is evaluated in Met Office Unified Model simulations against CloudSat observations. Simulations were run at 40-km and 12-km horizontal grid length using a convection parameterization scheme, and at 12-km, 4-km and 1.5-km grid length with the convection scheme effectively switched off, to study the impact of model resolution and the convection parameterization scheme on the organisation of tropical convection. Radar reflectivity is forward-modelled from the model cloud fields using the CloudSat simulator to present a like-with-like comparison with the CloudSat radar observations. The representation of cloud and precipitation at 12-km horizontal grid length improves dramatically when the convection parameterization is switched off, primarily because of a reduction in daytime (moist) convection. Further improvement is obtained when reducing the model grid length to 4 km or 1.5 km, especially in the representation of thin anvil and mid-level cloud, but three issues remain in all model configurations. Firstly, all simulations underestimate the fraction of anvils with cloud-top height above 12 km, which can be attributed to ice water contents in the model that are too low compared with satellite retrievals. Secondly, the model consistently detrains mid-level cloud too close to the freezing level, compared with higher altitudes in the CloudSat observations. Finally, there is too much low-level cloud cover in all simulations, and this bias was not improved by adjusting the rainfall parameters in the microphysics scheme. To improve model simulations of the WAM, more detailed and in situ observations of the dynamics and microphysics targeting these non-precipitating cloud types are required.
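
As a rough stand-in for the forward-modelling step (the actual CloudSat simulator performs full per-hydrometeor scattering calculations), the idea of mapping model ice water content to radar reflectivity can be illustrated with an assumed power-law relation; the coefficients below are illustrative only:

```python
import numpy as np

# Assumed power law IWC = A * Z**B relating ice water content (g m^-3)
# to 94-GHz radar reflectivity Z (mm^6 m^-3), inverted to give Z from
# the model's IWC field.
A, B = 0.137, 0.643  # illustrative coefficients, not the simulator's

def reflectivity_dbz(iwc: np.ndarray) -> np.ndarray:
    """Map model ice water content to a pseudo radar reflectivity (dBZ)."""
    z_linear = (np.maximum(iwc, 1e-6) / A) ** (1.0 / B)
    return 10.0 * np.log10(z_linear)

print(reflectivity_dbz(np.array([0.01, 0.1, 1.0])))
```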

Relevance: 20.00%

Abstract:

This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. Such a review provides a basis for discussing the need for information recalled through OLAP systems to maintain the contexts of the transactions captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in OLTP databases without the business rules that were used to process them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements for complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture data in particular OLTP systems. The paper argues that differences between the business rules used to interpret these same data sets risk gaps in semantics between information captured by OLTP systems and information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context, as part of the modelling of information systems, was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design depends critically on the capture of facts with associated context, the encoding of facts with context into data with business rules, the storage and sourcing of data with business rules, the decoding of data with business rules back into facts with context, and the recall of facts with associated contexts. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rules storage for OLTP and OLAP purposes. The proposed design model provides the opportunity for the implementation and use of multi-purpose databases and business-rules stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with executions of business rules, allowing both OLTP and OLAP systems to query data with the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as per the data captured by the respective OLTP system.
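
A hypothetical sketch of the central idea, storing each fact together with identifiers of the business rules in force when it was captured, might look as follows in Python; this is illustrative only and is not the paper's UBIRQ specification:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class BusinessRule:
    rule_id: str
    description: str

@dataclass
class Fact:
    values: dict          # the transactional data
    applied_rules: list   # rule_ids in force when the fact was captured
    captured_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

rules = {"R1": BusinessRule("R1", "Order totals include tax at 20%")}
fact = Fact({"order_id": 42, "total": 120.0}, applied_rules=["R1"])

# An OLAP-side recall can now interpret `total` using the capture-time rule:
for rid in fact.applied_rules:
    print(rid, "->", rules[rid].description)
```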

Relevance: 20.00%

Abstract:

Bioaccessibility studies have been widely used as a research tool to determine the potential human exposure to ingested contaminants. More recently they have been applied in practice to soil-borne toxic elements. This paper reviews the application of bioaccessibility tests across a range of organic pollutants and contaminated matrices. Important factors are reported to be: the physiological relevance of the test, the components in the gut media, the size fraction chosen for the test, and whether it contains a sorptive sink. Bioaccessibility is also a function of the composition of the matrix (e.g. the organic carbon content of soils) and the physico-chemical characteristics of the pollutant under test. Despite the widespread use of these tests, a large number of formats are in use and there are very few validation studies with animal models. We propose a unified format for a bioaccessibility test for organic pollutants. The robustness of this test should first be confirmed through inter-laboratory comparison, then tested in vivo.

Relevance: 20.00%

Abstract:

Evolved resistance to fungicides is a major problem limiting our ability to control agricultural, medical and veterinary pathogens, and is frequently associated with substitutions in the amino acid sequence of the target protein. The convention for describing amino acid substitutions is to cite the wild-type amino acid, the codon number and the new amino acid, using the one-letter amino acid code. It has frequently been observed that orthologous amino acid mutations have been selected in different species by fungicides from the same mode-of-action class, but the amino acids have different numbers. These differences in numbering arise from the different lengths of the proteins in each species. The purpose of the current paper is to propose a system for unifying the labelling of amino acids in fungicide target proteins. To do this we have produced alignments between the fungicide target proteins of relevant species, fitted to a well-studied “archetype” species. Orthologous amino acids in all species are then assigned numerical “labels” based on the position of the amino acid in the archetype protein.
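
The labelling procedure can be illustrated with a small Python sketch: given a pairwise alignment of a species' target protein against the archetype ("-" marking gaps), each species residue inherits the archetype position number. The toy alignment below is invented for illustration:

```python
def archetype_labels(archetype_aln: str, species_aln: str) -> dict:
    """Map species residue numbers -> archetype-based labels."""
    labels, a_pos, s_pos = {}, 0, 0
    for a, s in zip(archetype_aln, species_aln):
        if a != "-":
            a_pos += 1                # advance archetype residue counter
        if s != "-":
            s_pos += 1                # advance species residue counter
            if a != "-":
                labels[s_pos] = a_pos  # aligned: inherit archetype number
    return labels

# Toy alignment: the species has a one-residue insertion (Q) relative
# to the archetype, so its numbering shifts after position 3.
print(archetype_labels("MKV-LST", "MKVQLST"))
# -> {1: 1, 2: 2, 3: 3, 5: 4, 6: 5, 7: 6}
```

Residues falling opposite a gap (here the inserted Q) receive no archetype label, which mirrors the fact that they have no orthologous position in the archetype protein.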

Relevance: 20.00%

Abstract:

The Plant–Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed in comparison with the standard convection scheme, whose only stochastic element comes from random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered over the month of July 2009. Deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant–Craig parameterization is found to improve probabilistic forecast measures, particularly for lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant–Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements found are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.
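
As one example of the kind of probabilistic measure involved (the paper's exact metrics are not assumed here), a Brier score for exceedance of a precipitation threshold can be computed from ensemble members as follows; the data below are synthetic:

```python
import numpy as np

def brier_score(ens: np.ndarray, obs: np.ndarray, thresh: float) -> float:
    """ens: (n_members, n_cases) forecasts; obs: (n_cases,) observations.
    Forecast probability = fraction of members exceeding the threshold."""
    p_fc = (ens > thresh).mean(axis=0)   # forecast probabilities
    o = (obs > thresh).astype(float)     # binary outcomes
    return float(np.mean((p_fc - o) ** 2))

ens = np.random.default_rng(0).gamma(2.0, 2.0, size=(24, 100))  # 24 members
obs = np.random.default_rng(1).gamma(2.0, 2.0, size=100)
print(brier_score(ens, obs, thresh=1.0))  # lower is better
```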

Relevance: 20.00%

Abstract:

Objective. To estimate cataract surgical rates (CSR) for Brazil and each federal unit in 2006 and 2007, based on the number of surgeries performed by the Unified Health System, to help plan a comprehensive ophthalmology network that can eliminate cataract blindness in compliance with the World Health Organization (WHO) target of 3 000 cataract surgeries per million inhabitants per year. Methods. This descriptive study calculates CSR using the number of cataract surgeries carried out by the Brazilian Unified Health System for each federal unit and estimates the need for cataract surgery in Brazil for 2006-2007, with official population data provided by the Brazilian Institute of Geography and Statistics. The number of cataract surgeries was compared with the WHO target. Results. To reach the WHO goal for eliminating age-related cataract blindness in Brazil, 560 312 cataract surgeries needed to be done in 2006 and 568 006 in 2007. In 2006, 179 121 cataract surgeries were done by the Unified Health System, corresponding to a CSR of 959 per million population; in 2007, 223 317 were performed, with a CSR of 1 179. Adding the Brazilian Council of Ophthalmology's estimate of 165 000 surgeries each year by non-public services, the CSR for Brazil would be 1 842 for 2006 and 2 051 for 2007, leaving shortfalls relative to the proposed target of 38.6% in 2006 and 31.6% in 2007. Conclusions. Human resources, technical expertise and equipment are crucial to reaching the WHO goal. Brazil has enough ophthalmologists but needs improved planning and infrastructure to eliminate the problem, aspects that require greater financial investment and stronger political commitment.
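
A worked check of the reported figures against the WHO target, using only the numbers cited above:

```latex
% CSR definition and the shortfalls against the WHO target of
% 3 000 surgeries per million inhabitants per year:
\mathrm{CSR} = \frac{\text{cataract surgeries per year}}{\text{population in millions}},
\qquad
1 - \frac{1\,842}{3\,000} = 38.6\% \;(2006),
\qquad
1 - \frac{2\,051}{3\,000} \approx 31.6\% \;(2007).
```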

Relevance: 20.00%

Abstract:

In this paper, we formulate a flexible density function from the selection mechanism viewpoint (see, for example, Bayarri and DeGroot (1992) and Arellano-Valle et al. (2006)), which possesses appealing biological and physical interpretations. The new density function contains as special cases many models that have been proposed recently in the literature. In constructing this model, we assume that the number of competing causes of the event of interest has a general discrete distribution characterized by its probability generating function. This function plays an important role in the selection procedure as well as in computing the conditional personal cure rate. Finally, we illustrate how various models can be deduced as special cases of the proposed model.
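
The setup described appears to follow the standard competing-causes cure-rate construction; under that assumption (notation ours, not the paper's), the role of the probability generating function can be made explicit:

```latex
% Let N be the number of competing causes of the event, with probability
% generating function A(s) = E[s^N], and let S(t) be the survival function
% of the time to event for each cause. The population survival function is
S_{\mathrm{pop}}(t) = A\bigl(S(t)\bigr),
% and the fraction of long-term survivors (the cure rate) is
\lim_{t \to \infty} S_{\mathrm{pop}}(t) = A(0) = P(N = 0).
```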

Relevance: 20.00%

Abstract:

During the last decade, Internet usage has been growing at an enormous rate, accompanied by the development of network applications (e.g., video conferencing, audio/video streaming, e-learning, e-commerce and real-time applications) carrying several types of information, including data, voice, pictures and streaming media. While end-users demand very high quality of service (QoS) from their service providers, networks undergo complex traffic that leads to transmission bottlenecks. Considerable effort has been made to study the characteristics and behaviour of the Internet. Simulation modelling of computer network congestion is a profitable and effective technique that fulfils the requirements for evaluating the performance and QoS of networks. To simulate a single congested link, the simulation is run with a single load generator; for a larger simulation with complex traffic, where the nodes are spread across different geographical locations, generating distributed artificial loads is indispensable. One solution is to develop a load generation system based on a master/slave architecture.
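
A minimal Python sketch of the master/slave load-generation idea (hypothetical hosts, ports and payloads, not the system elaborated in the work above):

```python
import socket

# Illustrative placeholder addresses for the slave nodes.
SLAVES = [("10.0.0.2", 9000), ("10.0.0.3", 9000)]

def run_slave(listen_port: int, target: tuple) -> None:
    """Slave: wait for the master's command, then send artificial load."""
    with socket.socket() as srv:
        srv.bind(("", listen_port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            if conn.recv(64).decode().startswith("START"):
                with socket.socket() as load:
                    load.connect(target)
                    for _ in range(1000):          # one artificial load burst
                        load.sendall(b"x" * 1024)

def master_start(slaves: list) -> None:
    """Master: tell every registered slave to begin generating load."""
    for host, port in slaves:
        with socket.socket() as s:
            s.connect((host, port))
            s.sendall(b"START")
```

In a real system each slave would run on its own geographically distributed node, and the master would also collect timing statistics from the slaves to evaluate the congested link.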