Abstract:
Top Down Induction of Decision Trees (TDIDT) is the most commonly used method of constructing a model from a dataset in the form of classification rules to classify previously unseen data. Alternative algorithms have been developed, such as the Prism algorithm. Prism constructs modular rules which are qualitatively better than those induced by TDIDT. However, as databases grow in size, many existing rule learning algorithms have proved to be computationally expensive on large datasets. To tackle the problem of scalability, parallel classification rule induction algorithms have been introduced. As TDIDT is the most popular classifier, even though there are strongly competitive alternative algorithms, most parallel approaches to inducing classification rules are based on TDIDT. In this paper we describe work on a distributed classifier that induces classification rules in a parallel manner based on Prism.
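To make the contrast with TDIDT concrete, the covering loop at the heart of the Prism algorithm can be sketched as follows. This is a minimal illustration under our own data layout (dictionaries of attribute values with a `class` key), not the distributed version described in the paper:

```python
def prism(instances, target_class, class_key="class"):
    """Induce modular rules for one class with the basic Prism algorithm:
    greedily add the attribute-value test that maximises the probability
    of the target class until the rule covers only target-class instances."""
    rules, remaining = [], list(instances)
    while any(x[class_key] == target_class for x in remaining):
        covered, rule = remaining, {}
        # Grow one rule until it is "pure" for the target class.
        while any(x[class_key] != target_class for x in covered):
            best, best_p = None, -1.0
            for attr in covered[0]:
                if attr == class_key or attr in rule:
                    continue
                for value in {x[attr] for x in covered}:
                    subset = [x for x in covered if x[attr] == value]
                    p = sum(x[class_key] == target_class for x in subset) / len(subset)
                    if p > best_p:
                        best, best_p = (attr, value), p
            if best is None:
                break
            rule[best[0]] = best[1]
            covered = [x for x in covered if x[best[0]] == best[1]]
        rules.append(rule)
        # Remove instances covered by this rule and continue.
        remaining = [x for x in remaining
                     if not all(x.get(a) == v for a, v in rule.items())]
    return rules
```

Each induced rule is a conjunction of attribute-value tests covering only target-class instances; unlike a TDIDT decision tree, the rules for one class are grown independently of the other classes, which is what makes them "modular".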
Abstract:
If acid-sensitive drugs or cells are administered orally, there is often a reduction in efficacy associated with gastric passage. Formulation into a polymer matrix is a potential method to improve their stability. The visualization of pH within these materials may help better understand the action of these polymer systems and allow comparison of different formulations. We herein describe the development of a novel confocal laser-scanning microscopy (CLSM) method for visualizing pH changes within polymer matrices and demonstrate its applicability to an enteric formulation based on chitosan-coated alginate gels. The system in question is first shown to protect an acid-sensitive bacterial strain from low pH, before being studied by our technique. Prior to this study, it had been claimed that protection by these materials is a result of buffering, but this had not been demonstrated. The visualization of pH within these matrices during exposure to a pH 2.0 simulated gastric solution showed an encroachment of acid from the periphery of the capsule, and a persistence of pH values above 2.0 within the matrix. This implies that the protective effect of the alginate-chitosan matrices is most likely due to a combination of buffering of acid as it enters the polymer matrix and the slowing of acid penetration.
Abstract:
Organisations need the right business and IT capabilities in order to achieve future business success. It follows that the sourcing of these capabilities is an important decision. Yet, there is a lack of consensus on the approach to deciding where and how to source the core operational capabilities. Furthermore, developing its dynamic capability enables an organisation to effectively manage change in its operational capabilities. Recent research has proposed that analysing business capabilities is a key prerequisite to defining an organisation's Information Technology (IT) solutions. This research builds on these findings by considering the interdependencies between the dynamic business change capability and the sourcing of IT capabilities. Further, it examines the decision-making oversight of these areas as implemented through IT governance. There is a good understanding of the direct impact of IT sourcing decisions on operational capabilities. However, there is a lack of research on their indirect impact on the capability of managing business change. Through a review of prior research and initial pilot field research, a capability framework and three main propositions are proposed, each examining a two-way interdependency. This paper describes the development of the integrated capability framework and the rationale for the propositions. These respectively cover managing business change, IT sourcing and IT governance. Firstly, the sourcing of IT affects both the operational capabilities and the capability to manage business change. Similarly, a business change may result in new or revised operational capabilities, which can influence the IT sourcing decision, resulting in a two-way relationship. Secondly, this IT sourcing is directed under IT governance, which provides a decision-making framework for the organisation.
At the same time, the IT sourcing can have an impact on the IT governance capability, for example by outsourcing key capabilities; hence this is potentially again a two-way relationship. Finally, there is a postulated two-way relationship between IT governance and managing business change, in that IT governance provides an oversight of managing business change through portfolio management, while IT governance is a key element of the business change capability. Given the nature and novelty of this framework, a philosophical paradigm of constructivism is preferred. To illustrate and explore the theoretical perspectives provided, this paper reports on the findings of a case study incorporating eight high-level interviews with senior executives in a German bank with 2300 employees. The collected data also include organisational charts, annual reports, project and activity portfolios and benchmark reports for the IT budget. Recommendations are made for practitioners. An understanding of the interdependencies can support professionals in improving business success through effectively managing business change. Additionally, they can be assisted to evaluate the impact of IT sourcing decisions on the organisation's operational and dynamic capabilities, using an appropriate IT governance framework.
Abstract:
When people encounter emotional events, their memory for those events is typically enhanced. But it has been unclear how emotionally arousing events influence memory for preceding information. Does emotional arousal induce retrograde amnesia or retrograde enhancement? The current study revealed that this depends on the top-down goal relevance of the preceding information. Across three studies, we found that emotional arousal induced by one image facilitated memory for the preceding neutral item when people prioritized that neutral item. In contrast, an emotionally arousing image impaired memory for the preceding neutral item when people did not prioritize that neutral item. Emotional arousal elicited by both negative and positive pictures showed this pattern of enhancing or impairing memory for the preceding stimulus depending on its priority. These results indicate that emotional arousal amplifies the effects of top-down priority in memory formation.
Abstract:
Urban land surface models (LSMs) are commonly evaluated for short periods (a few weeks to months) because of limited observational data. This makes it difficult to distinguish the impact of initial conditions on model performance or to consider the response of a model to a range of possible atmospheric conditions. Drawing on results from the first urban LSM comparison, these two issues are considered. Assessment shows that the initial soil moisture has a substantial impact on performance. Models initialised with soils that are too dry are not able to adjust their surface sensible and latent heat fluxes to realistic values until there is sufficient rainfall. Models initialised with soils that are too wet are not able to restrict their evaporation appropriately for periods in excess of a year. This has implications for short-term evaluation studies and implies the need for soil moisture measurements to improve data assimilation and model initialisation. In contrast, initial conditions influencing the thermal storage have a much shorter adjustment timescale compared to soil moisture. Most models partition too much of the radiative energy at the surface into the sensible heat flux, at the probable expense of the net storage heat flux.
Abstract:
This paper presents an assessment of the implications of climate change for global river flood risk. It is based on the estimation of flood frequency relationships at a grid resolution of 0.5 × 0.5°, using a global hydrological model with climate scenarios derived from 21 climate models, together with projections of future population. Four indicators of the flood hazard are calculated: change in the magnitude and return period of flood peaks, flood-prone population and cropland exposed to substantial change in flood frequency, and a generalised measure of regional flood risk based on combining frequency curves with generic flood damage functions. Under one climate model, emissions and socioeconomic scenario (HadCM3 and SRES A1b), in 2050 the current 100-year flood would occur at least twice as frequently across 40 % of the globe, approximately 450 million flood-prone people and 430 thousand km² of flood-prone cropland would be exposed to a doubling of flood frequency, and global flood risk would increase by approximately 187 % over the risk in 2050 in the absence of climate change. There is strong regional variability (the most adverse impacts would be in Asia), and considerable variability between climate models. In 2050, the range in increased exposure across 21 climate models under SRES A1b is 31–450 million people and 59–430 thousand km² of cropland, and the change in risk varies between −9 % and +376 %. The paper presents impacts by region, and also presents relationships between change in global mean surface temperature and impacts on the global flood hazard. There are a number of caveats with the analysis: it is based on one global hydrological model only, the climate scenarios are constructed using pattern-scaling, and the precise impacts are sensitive to some of the assumptions made in its definition and application.
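As an illustration of the risk measure that combines frequency curves with damage functions, expected annual damage can be computed by integrating damage against annual exceedance probability. The following is a generic sketch using hypothetical return periods and damages, not the paper's actual curves:

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """Expected annual damage: EAD = integral of damage D over the annual
    exceedance probability p = 1/T, approximated by the trapezoidal rule."""
    p = 1.0 / np.asarray(return_periods, dtype=float)  # exceedance probabilities
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)                              # integrate over increasing p
    return float(np.trapz(d[order], p[order]))
```

Climate change shifts the frequency curve (e.g. a 100-year flood becoming a 50-year flood), which raises the exceedance probability attached to each damage level and hence the value of the integral.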
Abstract:
The current state of the art in the planning and coordination of autonomous vehicles is based upon the presence of speed lanes. In a traffic scenario with a large diversity among vehicles, the removal of speed lanes can generate a significantly higher traffic bandwidth. Vehicle navigation in such unorganized traffic is considered. An evolutionary trajectory planning technique has the advantages of making driving efficient and safe; however, it must also overcome the hurdle of computational cost. In this paper, we propose a real-time genetic algorithm with Bezier curves for trajectory planning. The main contribution is the integration of vehicle following and overtaking behaviour for general traffic as heuristics for the coordination between vehicles. The resultant coordination strategy is fast and near-optimal. As the vehicles move, uncertainties may arise which are constantly adapted to, and may even lead to the cancellation of an overtaking procedure or the initiation of a new one. Higher-level planning is performed by Dijkstra's algorithm, which indicates the route to be followed by the vehicle in a road network. Re-planning is carried out when a road blockage or obstacle is detected. Experimental results confirm the success of the algorithm subject to optimal high- and low-level planning, re-planning and overtaking.
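The Bezier-curve trajectory primitive used in such planners can be evaluated with De Casteljau's algorithm. This generic sketch shows only the curve primitive with hypothetical control points, not the paper's genetic algorithm or cost function:

```python
def bezier(control_points, t):
    """Evaluate a Bezier curve of any degree at parameter t in [0, 1]
    using De Casteljau's algorithm (numerically stable repeated lerp)."""
    pts = [tuple(float(c) for c in p) for p in control_points]
    while len(pts) > 1:
        # Linearly interpolate between consecutive control points.
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]
```

In a GA-based planner, each candidate trajectory can be encoded as a small set of control points, and the curve is sampled via `bezier` to score smoothness and collision cost, keeping the search space low-dimensional.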
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. Firstly, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples which, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
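A minimal sketch of a complex-valued extreme learning machine in the spirit described above: fixed random complex input weights, a nonlinear hidden layer, and a single least-squares solve for the output weights. The activation, layer sizes and data below are our own assumptions, not the paper's configuration:

```python
import numpy as np

def train_celm(X, T, hidden=64, seed=0):
    """Complex-valued ELM sketch: random complex input weights W and
    biases b are fixed; only the output weights beta are trained, by
    a pseudo-inverse (least-squares) solve."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = (rng.standard_normal((n_features, hidden))
         + 1j * rng.standard_normal((n_features, hidden)))
    b = rng.standard_normal(hidden) + 1j * rng.standard_normal(hidden)
    H = np.tanh(X @ W + b)           # hidden-layer activations (complex tanh)
    beta = np.linalg.pinv(H) @ T     # output weights via pseudo-inverse
    return W, b, beta

def predict_celm(X, W, b, beta):
    """Forward pass through the trained network."""
    return np.tanh(X @ W + b) @ beta
```

Because only the output weights are solved for, training reduces to one pseudo-inverse, which is what makes ELM-style classifiers attractive for the very large data sets produced by terahertz imaging spectrometers.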
Abstract:
In this paper we investigate the equilibrium properties of magnetic dipolar (ferro-) fluids and discuss finite-size effects originating from the use of different boundary conditions in computer simulations. Both periodic boundary conditions and a finite spherical box are studied. We demonstrate that periodic boundary conditions, with the subsequent use of the Ewald sum to account for the long-range dipolar interactions, lead to a much faster convergence (in terms of the number of investigated dipolar particles) of the magnetization curve and the initial susceptibility to their thermodynamic limits. Another unwanted effect of simulations in a finite spherical box geometry is a considerable sensitivity to the container size. We further investigate the influence of the surface term in the Ewald sum (that is, the term due to the surrounding continuum with magnetic permeability μ_BC) on the convergence properties of our observables and on the final results. The two different ways of evaluating the initial susceptibility, i.e., (1) by the magnetization response of the system to an applied field and (2) by the zero-field fluctuation of the mean-square dipole moment of the system, are compared in terms of speed and accuracy.
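The second (zero-field fluctuation) route can be sketched as follows: given samples of the instantaneous total dipole moment M of the simulation box, the initial susceptibility is estimated from the variance of M. This is a generic sketch; the unit-system prefactor (e.g. 4π or μ0) is omitted and depends on convention:

```python
import numpy as np

def initial_susceptibility(moments, volume, kT):
    """Zero-field fluctuation estimate of the initial susceptibility:
    chi ∝ (<M·M> - <M>·<M>) / (3 V kT), with M the instantaneous total
    dipole moment sampled over the simulation (prefactor omitted)."""
    M = np.asarray(moments, dtype=float)     # shape (n_samples, 3)
    mean_M = M.mean(axis=0)
    fluct = (M * M).sum(axis=1).mean() - mean_M @ mean_M
    return fluct / (3.0 * volume * kT)
```

The alternative route (1) instead fits the slope of the magnetization curve at small applied field; the fluctuation estimator needs no field sweep but requires well-equilibrated, decorrelated samples of M.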
Abstract:
Immunodiagnostic microneedles provide a novel way to extract protein biomarkers from the skin in a minimally invasive manner for analysis in vitro. The technology could overcome challenges in biomarker analysis specifically in solid tissue, which currently often involves invasive biopsies. This study describes the development of a multiplex immunodiagnostic device incorporating mechanisms to detect multiple antigens simultaneously, as well as internal assay controls for result validation. A novel detection method is also proposed. It enables signal detection specifically at microneedle tips and therefore may aid the construction of depth profiles of skin biomarkers. The detection method can be coupled with computerised densitometry for signal quantitation. The antigen specificity, sensitivity and functional stability of the device were assessed against a number of model biomarkers. Detection and analysis of endogenous antigens (interleukins 1α and 6) from the skin using the device was demonstrated. The results were verified using conventional enzyme-linked immunosorbent assays. The detection limit of the microneedle device, at ≤10 pg/mL, was at least comparable to conventional plate-based solid-phase enzyme immunoassays.
Abstract:
Tribe Merremieae, as currently circumscribed, comprise c. 120 species classified in seven genera, the largest of which (Merremia) is morphologically heterogeneous. Previous studies, with limited sampling, have suggested that neither Merremieae nor Merremia are monophyletic. In the present study, the monophyly of Merremia and its allied genera was re-assessed, sampling 57 species of Merremieae for the plastid matK, trnL–trnF and rps16 regions and the nuclear internal transcribed spacer (ITS) region. All genera of Merremieae and all major morphotypes in Merremia were represented. Phylogenetic analyses resolve Merremieae in a clade with Ipomoeae, Convolvuleae and Daustinia montana. Merremia is confirmed as polyphyletic and a number of well-supported and morphologically distinct clades in Merremieae are recognized which accommodate most of the species in the tribe. These provide a framework for a generic revision of the assemblage.