943 results for framework-intensive applications


Relevance: 30.00%

Publisher:

Abstract:

Three decades of ongoing executive concern over how to achieve successful alignment between business and information technology show the complexity of this vital process. Most alignment challenges relate to knowledge and organisational change, and several researchers have introduced mechanisms to address some of them. However, these mechanisms pay little attention to multi-level effects, which results in a limited understanding of alignment across levels. We therefore reviewed these challenges from a multi-level learning perspective and found that business and IT alignment is related to the balance of exploitation and exploration strategies with the intellectual content of individual, group and organisational levels.


The increasing use of social media, applications or platforms that allow users to interact online, ensures that this environment will provide a useful source of evidence for the forensics examiner. Current tools for the examination of digital evidence find this data problematic as they are not designed for the collection and analysis of online data. Therefore, this paper presents a framework for the forensic analysis of user interaction with social media. In particular, it presents an inter-disciplinary approach for the quantitative analysis of user engagement to identify relational and temporal dimensions of evidence relevant to an investigation. This framework enables the analysis of large data sets from which a (much smaller) group of individuals of interest can be identified. In this way, it may be used to support the identification of individuals who might be ‘instigators’ of a criminal event orchestrated via social media, or a means of potentially identifying those who might be involved in the ‘peaks’ of activity. In order to demonstrate the applicability of the framework, this paper applies it to a case study of actors posting to a social media Web site.
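
The abstract does not specify the engagement metrics, but the temporal dimension it describes can be illustrated with a toy sketch: bin posts in time, flag "peak" windows, and read off the (much smaller) set of accounts active in them. The user names, timestamps and threshold below are all hypothetical:

```python
from collections import Counter
from datetime import datetime

# Hypothetical post records: (user, timestamp) pairs collected from a platform.
posts = [
    ("alice", "2024-05-01 10:05"), ("bob", "2024-05-01 10:20"),
    ("carol", "2024-05-01 10:40"), ("alice", "2024-05-01 10:55"),
    ("dave", "2024-05-01 14:10"), ("alice", "2024-05-02 09:30"),
]

def activity_peaks(posts, threshold=3):
    """Bin posts per hour and flag hours whose volume meets the threshold."""
    bins = Counter()
    by_bin = {}
    for user, ts in posts:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").strftime("%Y-%m-%d %H:00")
        bins[hour] += 1
        by_bin.setdefault(hour, set()).add(user)
    # Return (hour, volume, active users) for each peak hour.
    return [(h, n, sorted(by_bin[h])) for h, n in bins.items() if n >= threshold]

for hour, n, users in activity_peaks(posts):
    print(hour, n, users)
```

A real investigation would of course work at much larger scale and with richer relational features, but the reduction from all posts to a short list of accounts in peak windows is the essence of the framework's filtering step.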


A method of classifying the upper tropospheric/lower stratospheric (UTLS) jets has been developed that allows satellite and aircraft trace gas data and meteorological fields to be efficiently mapped in a jet coordinate view. A detailed characterization of multiple tropopauses accompanies the jet characterization. Jet climatologies show the well-known high altitude subtropical and lower altitude polar jets in the upper troposphere, as well as a pattern of concentric polar and subtropical jets in the Southern Hemisphere, and shifts of the primary jet to high latitudes associated with blocking ridges in Northern Hemisphere winter. The jet-coordinate view segregates air masses differently than the commonly-used equivalent latitude (EqL) coordinate throughout the lowermost stratosphere and in the upper troposphere. Mapping O3 data from the Aura Microwave Limb Sounder (MLS) satellite and the Winter Storms aircraft datasets in jet coordinates thus emphasizes different aspects of the circulation compared to an EqL-coordinate framework: the jet coordinate reorders the data geometrically, thus highlighting the strong PV, tropopause height and trace gas gradients across the subtropical jet, whereas EqL is a dynamical coordinate that may blur these spatial relationships but provides information on irreversible transport. The jet coordinate view identifies the concentration of stratospheric ozone well below the tropopause in the region poleward of and below the jet core, as well as other transport features associated with the upper tropospheric jets. Using the jet information in EqL coordinates allows us to study trace gas distributions in regions of weak versus strong jets, and demonstrates weaker transport barriers in regions with less jet influence. 
MLS and Atmospheric Chemistry Experiment-Fourier Transform Spectrometer trace gas fields for spring 2008 in jet coordinates show very strong, closely correlated, PV, tropopause height and trace gas gradients across the jet, and evidence of intrusions of stratospheric air below the tropopause below and poleward of the subtropical jet; these features are consistent between instruments and among multiple trace gases. Our characterization of the jets is facilitating studies that will improve our understanding of upper tropospheric trace gas evolution.


Automatic generation of classification rules has become an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision-making systems. However, a principal problem that arises with most methods for generating classification rules is the overfitting of training data. With Big Data, this may result in the generation of a large number of complex rules, which not only increases computational cost but also lowers accuracy in predicting further unseen instances. This has led to the need for pruning methods that simplify rules. In addition, once generated, classification rules are used to make predictions. For efficiency, it is desirable to find the first rule that fires as quickly as possible when searching through a rule set, so a suitable structure is required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for the construction of rule-based classification systems consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors also review some existing methods and techniques used for each of the three operations and highlight their limitations, and then introduce some novel methods and techniques they have developed recently. These methods and techniques are discussed in comparison to existing ones with respect to efficient processing of Big Data.
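
As a toy illustration of the third operation, rule representation, an ordered rule list can be searched so that prediction stops at the first rule that fires. The attribute names and rules below are invented, not the authors':

```python
# Each rule is (conditions, label); an empty condition set is a default rule.
rules = [
    ({"outlook": "sunny", "humidity": "high"}, "no"),
    ({"outlook": "overcast"}, "yes"),
    ({}, "yes"),  # default rule: fires when nothing else matches
]

def fires(conditions, instance):
    """A rule fires when every one of its conditions matches the instance."""
    return all(instance.get(attr) == value for attr, value in conditions.items())

def predict(rules, instance):
    """Search the ordered rule list and return the label of the first rule that fires."""
    for conditions, label in rules:
        if fires(conditions, instance):
            return label

print(predict(rules, {"outlook": "sunny", "humidity": "high"}))  # -> "no"
```

With a linear list, prediction cost grows with the number of rules, which is exactly why the chapter's concern with compact rule representations matters for Big Data.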


We discuss the modelling of dielectric responses of amorphous biological samples. Such samples are commonly encountered in impedance spectroscopy studies, as well as in UV, IR, optical and THz transient spectroscopy experiments and in pump-probe studies. On many occasions the samples may display quenched absorption bands. A systems identification framework may be developed to provide parsimonious representations of such responses. To achieve this, it is appropriate to augment the standard models found in the identification literature to incorporate fractional-order dynamics. Extensions of models using the forward shift operator, state-space models, as well as their nonlinear Hammerstein-Wiener counterparts, are highlighted. We also discuss the need to extend the theory of electromagnetically excited networks to account for fractional-order behaviour in the nonlinear regime, by incorporating nonlinear elements that capture the observed nonlinearities. The proposed approach leads to the development of a range of new chemometrics tools for biomedical data analysis and classification.
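
One standard fractional-order dielectric response is the Cole-Cole model; the identification framework described here is more general, so treat the following as an illustrative special case only:

```python
# Cole-Cole permittivity, a classic fractional-order dielectric response.
# Parameter values below are arbitrary illustration, not fitted data.

def cole_cole(omega, eps_inf, delta_eps, tau, alpha):
    """Complex permittivity eps_inf + delta_eps / (1 + (1j*omega*tau)**alpha).
    alpha = 1 recovers the classical (integer-order) Debye relaxation;
    0 < alpha < 1 gives the fractional-order broadening."""
    return eps_inf + delta_eps / (1.0 + (1j * omega * tau) ** alpha)

# The static limit omega -> 0 recovers eps_inf + delta_eps.
print(cole_cole(0.0, 2.0, 8.0, 1e-9, 0.8))  # (10+0j)
print(cole_cole(1e9, 2.0, 8.0, 1e-9, 0.8))  # dispersive complex response
```

The fractional exponent alpha is what an augmented identification model must estimate alongside the usual gains and time constants.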


Regional climate downscaling has arrived at an important juncture. Some in the research community favour continued refinement and evaluation of downscaling techniques within a broader framework of uncertainty characterisation and reduction. Others are calling for smarter use of downscaling tools, accepting that conventional, scenario-led strategies for adaptation planning have limited utility in practice. This paper sets out the rationale and new functionality of the Decision Centric (DC) version of the Statistical DownScaling Model (SDSM-DC). This tool enables synthesis of plausible daily weather series, exotic variables (such as tidal surge), and climate change scenarios guided, not determined, by climate model output. Two worked examples are presented. The first shows how SDSM-DC can be used to reconstruct and in-fill missing records based on calibrated predictor-predictand relationships. Daily temperature and precipitation series from sites in Africa, Asia and North America are deliberately degraded to show that SDSM-DC can reconstitute lost data. The second demonstrates the application of the new scenario generator for stress testing a specific adaptation decision. SDSM-DC is used to generate daily precipitation scenarios to simulate winter flooding in the Boyne catchment, Ireland. This sensitivity analysis reveals the conditions under which existing precautionary allowances for climate change might be insufficient. We conclude by discussing the wider implications of the proposed approach and research opportunities presented by the new tool.
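
The first worked example can be caricatured in a few lines: calibrate a predictor-predictand relationship on the days where the site record exists, then reconstitute the gaps. The numbers below are invented, and SDSM-DC calibrates proper downscaling models rather than this bare regression:

```python
# Toy predictor-predictand infilling via least squares (illustrative only).

def calibrate(xs, ys):
    """Least-squares intercept and slope for y ~ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

predictor = [10.0, 12.0, 14.0, 16.0, 18.0]   # e.g. a regional climate predictor
predictand = [15.0, 17.0, None, None, 23.0]  # site record with missing days

known = [(x, y) for x, y in zip(predictor, predictand) if y is not None]
a, b = calibrate([x for x, _ in known], [y for _, y in known])
filled = [y if y is not None else a + b * x for x, y in zip(predictor, predictand)]
print(filled)  # gaps at positions 2 and 3 reconstructed (~19.0 and ~21.0 here)
```

The paper's deliberate-degradation test follows the same logic at scale: remove known values, infill them from the calibrated relationship, and compare against the withheld truth.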


A truly variance-minimizing filter is introduced and its performance is demonstrated with the Korteweg-de Vries (KdV) equation and with a multilayer quasigeostrophic model of the ocean area around South Africa. It is recalled that Kalman-like filters are not variance minimizing for nonlinear model dynamics and that four-dimensional variational data assimilation (4DVAR)-like methods relying on perfect model dynamics have difficulty with providing error estimates. The new method does not have these drawbacks. In fact, it combines advantages from both methods in that it does provide error estimates while automatically having balanced states after analysis, without extra computations. It is based on ensemble or Monte Carlo integrations to simulate the probability density of the model evolution. When observations are available, the so-called importance resampling algorithm is applied. From Bayes's theorem it follows that each ensemble member receives a new weight dependent on its "distance" to the observations. Because the weights are strongly varying, a resampling of the ensemble is necessary. This resampling is done such that members with high weights are duplicated according to their weights, while low-weight members are largely ignored. In passing, it is noted that data assimilation is not an inverse problem by nature, although it can be formulated that way. Also, it is shown that the posterior variance can be larger than the prior if the usual Gaussian framework is set aside. However, in the examples presented here, the entropy of the probability densities is decreasing. The application to the ocean area around South Africa, governed by strongly nonlinear dynamics, shows that the method is working satisfactorily. The strong and weak points of the method are discussed and possible improvements are proposed.
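
The resampling step described above can be sketched as follows, assuming for illustration a scalar state and a Gaussian observation likelihood (the paper's models are of course far higher dimensional):

```python
import math
import random

def importance_resample(ensemble, obs, obs_var, rng):
    """Weight each member by its likelihood given the observation (Bayes),
    then resample so that high-weight members are duplicated and
    low-weight members are largely dropped."""
    weights = [math.exp(-0.5 * (x - obs) ** 2 / obs_var) for x in ensemble]
    total = sum(weights)
    weights = [w / total for w in weights]
    # New ensemble of the same size, members drawn with probability
    # proportional to their weights.
    return rng.choices(ensemble, weights=weights, k=len(ensemble))

rng = random.Random(0)
ensemble = [-2.0, -1.0, 0.0, 1.0, 2.0]
new = importance_resample(ensemble, obs=1.5, obs_var=0.5, rng=rng)
print(new)  # members near the observation 1.5 dominate after resampling
```

Note that the resampled ensemble contains only duplicates of prior members; it is the weighting, not any Gaussian update, that carries the Bayesian information, which is why the method tolerates non-Gaussian statistics.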


General principles of climate change adaptation for biodiversity have been formulated, but do not help prioritize actions. This is inhibiting their integration into conservation planning. We address this need with a decision framework that identifies and prioritizes actions to increase the adaptive capacity of species. The framework classifies species according to their current distribution and projected future climate space, as a basis for selecting appropriate decision trees. Decisions rely primarily on expert opinion, with additional information from quantitative models, where data are available. The framework considers in-situ management, followed by interventions at the landscape scale and finally translocation or ex-situ conservation. Synthesis and applications: From eight case studies, the key interventions identified for integrating climate change adaptation into conservation planning were local management and expansion of sites. We anticipate that, in combination with consideration of socio-economic and local factors, the decision framework will be a useful tool for conservation and natural resource managers to integrate adaptation measures into conservation plans.


In many data sets from clinical studies there are patients insusceptible to the occurrence of the event of interest. Survival models that ignore this fact are generally inadequate. The main goal of this paper is to describe an application of the generalized additive models for location, scale and shape (GAMLSS) framework to the fitting of long-term survival models. In this work the number of competing causes of the event of interest follows the negative binomial distribution. In this way, some well-known models found in the literature are characterized as particular cases of our proposal. The model is conveniently parameterized in terms of the cured fraction, which is then linked to covariates. We explore the use of the gamlss package in R as a powerful tool for inference in long-term survival models. The procedure is illustrated with a numerical example. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
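
The abstract gives no formulas; one common parameterization from the cure-rate literature with negative binomial competing causes, assumed here purely for illustration, yields a population survival function that plateaus at the cured fraction:

```python
import math

# Illustrative cure-rate survival with negative binomial competing causes:
# S_pop(t) = (1 + phi * eta * F(t)) ** (-1 / phi), one common form from the
# literature and not necessarily the paper's exact parameterization.

def pop_survival(t, eta, phi, cause_cdf):
    """eta: mean number of competing causes; phi: dispersion parameter;
    cause_cdf: cdf F of the time-to-event for a single cause."""
    return (1.0 + phi * eta * cause_cdf(t)) ** (-1.0 / phi)

def exp_cdf(t):
    return 1.0 - math.exp(-t)  # unit-rate exponential cause, for illustration

# As t -> infinity, F(t) -> 1 and the curve plateaus at the cured fraction.
cured_fraction = pop_survival(float("inf"), 1.5, 0.5, exp_cdf)
print(cured_fraction)  # (1 + 0.5*1.5)**(-2) ~ 0.327
```

Linking the cured fraction to covariates, as the paper does within GAMLSS, then amounts to modelling this plateau as a function of patient characteristics.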


We review several asymmetrical links for binary regression models and present a unified approach for two skew-probit links proposed in the literature. Moreover, under the skew-probit link, conditions for the existence of the ML estimators and of the posterior distribution under improper priors are established. The framework proposed here considers two sets of latent variables, which are helpful for implementing the Bayesian MCMC approach. A simulation study of criteria for model comparison is conducted, and two applications are presented. Using different Bayesian criteria we show that, for these data sets, the skew-probit links are better than alternative links proposed in the literature.
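
As a numerical illustration of what a skew-probit link is (the paper studies two specific formulations, which may differ from this one), the inverse link can be taken as the cdf of a skew-normal variable:

```python
import math

# Skew-probit inverse link sketch: success probability is the cdf of a
# skew-normal variable with shape lam, density 2*phi(z)*Phi(lam*z),
# integrated here by the trapezoidal rule. lam = 0 recovers ordinary probit.

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def skew_normal_pdf(z, lam):
    return 2.0 * math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi) * norm_cdf(lam * z)

def skew_probit_inv(eta, lam, lower=-10.0, n=20000):
    """P(Y = 1 | linear predictor eta) under this skew-probit link."""
    h = (eta - lower) / n
    s = 0.5 * (skew_normal_pdf(lower, lam) + skew_normal_pdf(eta, lam))
    s += sum(skew_normal_pdf(lower + i * h, lam) for i in range(1, n))
    return s * h

print(abs(skew_probit_inv(0.7, 0.0) - norm_cdf(0.7)) < 1e-6)  # True
```

The shape parameter lam tilts probability mass to one side of the linear predictor, which is exactly the asymmetry the reviewed links are designed to capture.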


In this work we study, in the framework of Colombeau's generalized functions, the Hamilton-Jacobi equation with a given initial condition. We have obtained theorems on the existence of solutions and, in some cases, uniqueness. Our technique is adapted from the classical method of characteristics, with extensive use of generalized functions. We were also led to obtain some general results on invertibility and on ordinary differential equations of such generalized functions. (C) 2011 Elsevier Inc. All rights reserved.
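
For orientation, in the smooth classical setting, and for a Hamilton-Jacobi equation of the form $u_t + H(x, \nabla u) = 0$ (a standard form, assumed here for illustration), the method of characteristics replaces the PDE by the ODE system

```latex
\begin{aligned}
\dot{x}(s) &= \nabla_p H\bigl(x(s), p(s)\bigr),\\
\dot{p}(s) &= -\nabla_x H\bigl(x(s), p(s)\bigr),\\
\dot{u}(s) &= p(s)\cdot\nabla_p H\bigl(x(s), p(s)\bigr) - H\bigl(x(s), p(s)\bigr),
\end{aligned}
```

where $p$ tracks $\nabla u$ along each curve. The paper's contribution is to carry this construction through when data and solutions are Colombeau generalized functions, which is why it needs the auxiliary invertibility and ODE results it mentions.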


Approximate Lie symmetries of the Navier-Stokes equations are used for applications to scaling phenomena arising in turbulence. In particular, we show that the Lie symmetries of the Euler equations are inherited by the Navier-Stokes equations in the form of approximate symmetries, which allows the Reynolds-number dependence to be incorporated into scaling laws. Moreover, the optimal systems of all finite-dimensional Lie subalgebras of the approximate symmetry transformations of the Navier-Stokes equations are constructed. We show how the scaling groups obtained can be used to introduce the Reynolds-number dependence into scaling laws explicitly for stationary parallel turbulent shear flows. This is demonstrated in the framework of a new approach to deriving scaling laws based on symmetry analysis [11]-[13].


Single-page applications have historically been subject to strong market forces that favour fast development and deployment over quality control and changeable code, which are important factors for maintainability. In this report we develop two functionally equivalent applications using AngularJS and React and compare their maintainability as defined by ISO/IEC 9126. AngularJS and React represent two distinct approaches to web development, with AngularJS being a general framework providing rich base functionality and React a small, specialized library for efficient view rendering. The quality comparison was accomplished by calculating the Maintainability Index for each application. Version-control analysis was used to determine quality indicators during development and subsequent maintenance, where new functionality was added in two steps. The results show no major differences in maintainability in the initial applications. As more functionality is added, the Maintainability Index decreases faster in the AngularJS application, indicating a steeper increase in complexity compared to the React application. Source-code analysis reveals that changes in data flow require significantly larger modifications of the AngularJS application due to its inherent architecture for data flow. We conclude that frameworks are useful when they facilitate development of known requirements, but less so when applications and systems grow in size.
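
For reference, a commonly cited variant of the Maintainability Index (the Oman and Hagemeister formula; the thesis may use a different variant) combines Halstead volume, cyclomatic complexity and lines of code:

```python
import math

# Classic Maintainability Index (one common variant); the input values
# below are invented, not measurements from the report's applications.

def maintainability_index(halstead_volume, cyclomatic_complexity, loc):
    return (171.0
            - 5.2 * math.log(halstead_volume)
            - 0.23 * cyclomatic_complexity
            - 16.2 * math.log(loc))

# Adding functionality (more volume, complexity and lines) lowers the index.
before = maintainability_index(1000.0, 10.0, 200.0)
after = maintainability_index(4000.0, 25.0, 500.0)
print(round(before, 1), round(after, 1))  # 46.9 21.4
```

The logarithms mean the index falls quickly for small programs and more slowly for large ones, so a faster decline, as observed for the AngularJS application, signals a genuinely steeper growth in complexity.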


This paper illustrates the use of the marginal cost of public funds (MCF) concept in three contexts. First, we extend Parry's (2003) analysis of the efficiency effects of excise taxes in the U.K., primarily by incorporating the distortion caused by imperfect competition in the cigarette market and distinguishing between the MCFs for per-unit and ad valorem taxes on cigarettes. Our computations show, contrary to the standard result in the literature, that the per-unit tax on cigarettes has a slightly lower MCF than the ad valorem tax. Second, we calculate the MCF for a payroll tax in a labour market with involuntary unemployment, using the Shapiro and Stiglitz (1984) efficiency-wage model as our framework. Our computations, based on Canadian labour market data, indicate that incorporating the distortion caused by involuntary unemployment raises the MCF by 25 to 50 percent. Third, we derive expressions for the distributionally weighted MCFs for the exemption level and the marginal tax rate of a "flat tax", such as the one adopted by the province of Alberta. This allows us to develop a restricted but tractable version of the optimal income tax problem. Computations indicate that the optimal marginal tax rate may be quite high, even with relatively modest pro-poor distributional preferences.
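
The paper's MCF computations are model specific; as a textbook point of reference (one common variant, not necessarily the authors' formula), the MCF of a simple proportional labour tax can be written as 1 / (1 - (t / (1 - t)) * eta), where eta is the uncompensated labour-supply elasticity:

```python
# Textbook MCF for a proportional labour tax (illustrative variant only).
# eta > 0 means raising the tax erodes the base, pushing the MCF above 1;
# the parameter values below are arbitrary.

def mcf_labour_tax(t, eta):
    return 1.0 / (1.0 - (t / (1.0 - t)) * eta)

print(round(mcf_labour_tax(0.40, 0.2), 3))  # modest elasticity -> MCF ~ 1.15
```

The paper's contribution is to show how departures from this competitive benchmark, imperfect competition, efficiency-wage unemployment, and distributional weights, change such MCF calculations.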