921 results for systems approach


Relevance:

30.00%

Publisher:

Abstract:

Knowledge management has become a promising method for supporting clinicians' decisions and improving the quality of medical services in the constantly changing clinical environment. However, current medical knowledge management systems cannot understand users' requirements accurately or realize personalized matching. Therefore, this paper proposes an ontological approach to personalized medical knowledge matching based on semiotic principles. In particular, healthcare domain knowledge is conceptualized and an ontology-based user profile is built. Furthermore, the personalized matching mechanism and algorithm are illustrated.
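The matching mechanism itself is not given in the abstract; the following is a minimal illustrative sketch, assuming a simple concept-overlap (Jaccard) score between an ontology-based user profile and candidate knowledge items. All names and data here are invented.

```python
# Hypothetical sketch: match knowledge items to an ontology-based
# user profile by concept-set overlap (Jaccard similarity).

def jaccard(a, b):
    """Similarity between two concept sets (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_items(profile, items):
    """Rank knowledge items by overlap with the user's profile concepts."""
    scored = [(jaccard(profile, concepts), name) for name, concepts in items.items()]
    return [name for score, name in sorted(scored, reverse=True)]

profile = {"diabetes", "insulin", "nephropathy"}        # user profile concepts
items = {
    "guideline_a": {"diabetes", "insulin", "diet"},
    "guideline_b": {"asthma", "inhaler"},
    "guideline_c": {"diabetes", "nephropathy", "dialysis"},
}
ranking = rank_items(profile, items)    # guideline_b ranks last: no overlap
```

In a real ontology-based system the overlap would be computed over concept hierarchies rather than flat sets, so that related (not just identical) concepts contribute to the score.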

Relevance:

30.00%

Publisher:

Abstract:

Purpose: The research objective of this study is to understand how institutional changes to the EU regulatory landscape may affect corresponding institutionalized operational practices within financial organizations. Design/methodology/approach: The study adopts an Investment Management System as its case and investigates different implementations of this system within eight financial organizations, predominantly focused on investment banking and asset management activities within capital markets. At the systems vendor's site, senior systems consultants and client relationship managers were interviewed. Within the financial organizations, compliance, risk and systems experts were interviewed. Findings: The study empirically tests modes of institutional change. Displacement and Layering were found to be the most prevalent modes. However, the study highlights how the outcomes of Displacement and Drift may be similar in effect, as both modes may cause compliance gaps. The research highlights how changes in regulations may create gaps in systems and processes which, in the short term, need to be plugged by manual processes. Practical implications: Vendors' ability to manage institutional change caused by Drift, Displacement, Layering and Conversion, and to efficiently and quickly translate institutional variables into structured systems, has the power to ease the pain and cost of compliance as well as to reduce the risk of breaches by reducing the need for interim manual systems. Originality/value: The study makes a contribution by applying recent theoretical concepts of institutional change to the topic of regulatory change and using this analysis to provide insight into the effects of this new environment.

Relevance:

30.00%

Publisher:

Abstract:

Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow model as an example.

Relevance:

30.00%

Publisher:

Abstract:

A lattice Boltzmann model able to simulate viscous fluid systems with elastic and movable boundaries is proposed. By introducing a virtual distribution function at the boundary, Galilean invariance is recovered for the full system. As an example of application, the flow in elastic vessels is simulated with a pressure-radius relationship similar to that of the pulmonary blood vessels. The numerical results for steady flow are in good agreement with the analytical prediction, while the simulation results for pulsatile flow agree qualitatively with experimental observations of aortic flows. The approach has potential application in the study of complex fluid systems such as suspension systems as well as arterial blood flow.
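For orientation, the core of any such model is the lattice Boltzmann collision step. The sketch below shows a textbook D2Q9 BGK collision for a single lattice site; it is not the paper's model, which additionally introduces a virtual distribution function at elastic boundaries.

```python
# Standard D2Q9 BGK lattice Boltzmann collision step (single site).
import numpy as np

# D2Q9 lattice velocities and quadrature weights.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, u):
    """Maxwellian equilibrium, expanded to second order in velocity."""
    cu = c @ u                                   # c_i . u for each direction
    usq = u @ u
    return rho * w * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * usq)

def collide(f, tau=0.6):
    """BGK relaxation of the populations toward local equilibrium."""
    rho = f.sum()                                # density (conserved)
    u = (f @ c) / rho                            # velocity (conserved)
    return f - (f - equilibrium(rho, u)) / tau

f0 = equilibrium(1.0, np.array([0.05, 0.0]))
f1 = collide(f0)          # equilibrium is a fixed point of the collision
```

Mass and momentum are conserved by construction, since the equilibrium is built from the same density and velocity as the pre-collision populations.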

Relevance:

30.00%

Publisher:

Abstract:

In this paper we provide a connection between the geometrical properties of the attractor of a chaotic dynamical system and the distribution of extreme values. We show that the extremes of so-called physical observables are distributed according to the classical generalised Pareto distribution and derive explicit expressions for the scaling and the shape parameter. In particular, we derive that the shape parameter does not depend on the chosen observables, but only on the partial dimensions of the invariant measure on the stable, unstable, and neutral manifolds. The shape parameter is negative and is close to zero when high-dimensional systems are considered. This result agrees with what was derived recently using the generalized extreme value approach. Combining the results obtained using such physical observables and the properties of the extremes of distance observables, it is possible to derive estimates of the partial dimensions of the attractor along the stable and the unstable directions of the flow. Moreover, by writing the shape parameter in terms of moments of the extremes of the considered observable and by using linear response theory, we relate the sensitivity to perturbations of the shape parameter to the sensitivity of the moments, of the partial dimensions, and of the Kaplan–Yorke dimension of the attractor. Preliminary numerical investigations provide encouraging results on the applicability of the theory presented here. The results presented here do not apply for all combinations of Axiom A systems and observables, but the breakdown seems to be related to very special geometrical configurations.
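As an illustration of the peaks-over-threshold idea underlying this analysis (not the paper's derivation), one can extract exceedances of an observable over a high threshold and estimate the generalised Pareto parameters by the method of moments. The exponential series below is a stand-in for a time series of a chaotic observable; its tail has shape parameter zero.

```python
# Illustrative sketch: peaks-over-threshold extraction and
# method-of-moments estimates of the GPD shape and scale.
import numpy as np

def gpd_moment_fit(excesses):
    """Method-of-moments GPD estimates: shape xi and scale sigma."""
    m, v = excesses.mean(), excesses.var()
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1.0)
    return xi, sigma

def peaks_over_threshold(series, threshold):
    """Excesses of the observable above a high threshold."""
    return series[series > threshold] - threshold

rng = np.random.default_rng(0)
series = rng.exponential(size=100_000)     # stand-in for a chaotic observable
excesses = peaks_over_threshold(series, threshold=2.0)
xi, sigma = gpd_moment_fit(excesses)       # exponential tail: xi ~ 0, sigma ~ 1
```

In the paper's setting the estimated shape parameter would reflect the partial dimensions of the invariant measure rather than being exactly zero.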

Relevance:

30.00%

Publisher:

Abstract:

The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and in some cases floating point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on a core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo-exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, with interpolation between results as necessary.
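The interpolation step can be sketched as follows; the benchmark timings and problem sizes below are invented for illustration, with compute cost scaling roughly with the local domain area and halo-exchange cost with its perimeter.

```python
# A minimal sketch of the benchmark-driven idea: measure kernel and
# halo-exchange times at a few problem sizes, then interpolate to
# predict the per-step cost of an unmeasured deployment.
import numpy as np

# Benchmarked local-domain edge sizes and per-step times (seconds; invented).
sizes        = np.array([128.0, 256.0, 512.0, 1024.0])
compute_time = np.array([0.0010, 0.0041, 0.0165, 0.0660])   # ~ size^2
halo_time    = np.array([0.0002, 0.0004, 0.0008, 0.0016])   # ~ size

def predict_step_time(local_size):
    """Interpolate compute and communication costs for a decomposition."""
    c = np.interp(local_size, sizes, compute_time)
    h = np.interp(local_size, sizes, halo_time)
    return c + h

t = predict_step_time(384.0)    # prediction for an unmeasured local size
```

A real model of this kind would hold one such table per execution scenario (node population, core affinity), so that a deployment's cost is read off from the benchmarks that most closely match it.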

Relevance:

30.00%

Publisher:

Abstract:

The notion that learning can be enhanced when a teaching approach matches a learner's learning style has been widely accepted in classroom settings, since learning style is a predictor of a student's attitudes and preferences. As such, the traditional 'one-size-fits-all' approach to teaching delivery in Educational Hypermedia Systems (EHSs) has to be replaced with an approach that responds to users' needs by exploiting their individual differences. However, establishing and implementing reliable approaches for matching teaching delivery and modalities to learning styles still represents an innovation challenge which has to be tackled. In this paper, seventy-six studies are objectively analysed with several goals. In order to reveal the value of integrating learning styles in EHSs, different perspectives in this context are discussed, and the most effective learning style models incorporated within AEHSs are identified. Investigating the effectiveness of different approaches for modelling students' individual learning traits is another goal of this study. Thus, the paper highlights a number of theoretical and technical issues of LS-BAEHSs to serve as comprehensive guidance for researchers interested in this area.

Relevance:

30.00%

Publisher:

Abstract:

The induction of classification rules from previously unseen examples is one of the most important data mining tasks in science as well as commercial applications. In order to reduce the influence of noise in the data, ensemble learners are often applied. However, most ensemble learners are based on decision tree classifiers, which are affected by noise. The Random Prism classifier has recently been proposed as an alternative to the popular Random Forests classifier, which is based on decision trees. Random Prism is based on the Prism family of algorithms, which is more robust to noise. However, like most ensemble classification approaches, Random Prism also does not scale well on large training data. This paper presents a thorough discussion of Random Prism and a recently proposed parallel version of it called Parallel Random Prism. Parallel Random Prism is based on the MapReduce programming paradigm. The paper provides, for the first time, a novel theoretical analysis of the proposed technique and an in-depth experimental study that shows that Parallel Random Prism scales well on a large number of training examples, a large number of data features and a large number of processors. The expressiveness of the decision rules that our technique produces makes it a natural choice for Big Data applications where informed decision making increases the user's trust in the system.
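The MapReduce structure behind this kind of ensemble can be illustrated with a toy sketch: each "mapper" induces a rule-based classifier on a bootstrap sample and the "reducer" combines predictions by majority vote. A trivial one-feature rule learner stands in here for the Prism base classifier; this is not the authors' algorithm.

```python
# Toy map/reduce-style rule ensemble (bagging + majority vote).
import random
from collections import Counter

def train_rule(sample):
    """Map step: learn a one-feature rule (most common label per value)."""
    by_value = {}
    for features, label in sample:
        by_value.setdefault(features[0], Counter())[label] += 1
    return {v: c.most_common(1)[0][0] for v, c in by_value.items()}

def map_phase(data, n_learners, rng):
    """Train each ensemble member on an independent bootstrap sample."""
    return [train_rule([rng.choice(data) for _ in data])
            for _ in range(n_learners)]

def reduce_phase(rules, features, default="?"):
    """Reduce step: majority vote over the ensemble's predictions."""
    votes = Counter(rule.get(features[0], default) for rule in rules)
    return votes.most_common(1)[0][0]

rng = random.Random(42)
data = [((0,), "a")] * 8 + [((1,), "b")] * 8
rules = map_phase(data, n_learners=5, rng=rng)
prediction = reduce_phase(rules, (0,))
```

In the actual system the map phase would be distributed over a cluster, with each mapper holding its own partition of the training data.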

Relevance:

30.00%

Publisher:

Abstract:

Weather, climate, water and related environmental conditions, including air quality, all have profound effects on cities. A growing importance is being attached to understanding and predicting atmospheric conditions and their interactions with other components of the Earth System in cities, at multiple scales. We highlight the need for: (1) development of high-resolution coupled environmental prediction models that include realistic city-specific processes, boundary conditions and fluxes; (2) enhanced observational systems to support (force, constrain, evaluate) these models to provide high quality forecasts for new urban services; (3) provision of meteorological and related environmental variables to aid protection of human health and the environment; (4) new targeted and customized delivery platforms using modern communication techniques, developed with users to ensure that services, advice and warnings result in appropriate action; and (5) development of new skill and capacity to make best use of technologies to deliver new services in complex, challenging and evolving city environments. We highlight the importance of a coordinated and strategic approach that draws on, but does not replicate, past work to maximize benefits to stakeholders.

Relevance:

30.00%

Publisher:

Abstract:

There is little consensus on how agriculture will meet future food demands sustainably. Soils and their biota play a crucial role by mediating ecosystem services that support agricultural productivity. However, a multitude of site-specific environmental factors and management practices interact to affect the ability of soil biota to perform vital functions, confounding the interpretation of results from experimental approaches. Insights can be gained through models, which integrate the physiological, biological and ecological mechanisms underpinning soil functions. We present a powerful modelling approach for predicting how agricultural management practices (pesticide applications and tillage) affect soil functioning through earthworm populations. By combining energy budgets and individual-based simulation models, and integrating key behavioural and ecological drivers, we accurately predict population responses to pesticide applications in different climatic conditions. We use the model to analyse the ecological consequences of different weed management practices. Our results demonstrate that an important link between agricultural management (herbicide applications and zero, reduced and conventional tillage) and earthworms is the maintenance of soil organic matter (SOM). We show how zero and reduced tillage practices can increase crop yields while preserving natural ecosystem functions. This demonstrates that management practices which aim to sustain agricultural productivity should account for their effects on earthworm populations, as earthworm proliferation stimulates productivity. Synthesis and applications: Our results indicate that conventional tillage practices have longer-term effects on soil biota than pesticide control, if the pesticide has a short dissipation time. The risk of earthworm populations becoming exposed to toxic pesticides will be reduced under dry soil conditions. Similarly, an increase in soil organic matter could increase the recovery rate of earthworm populations. However, effects are not necessarily additive, and the impact of different management practices on earthworms depends on their timing and the prevailing environmental conditions. Our model can be used to determine which combinations of crop management practices and climatic conditions pose the least overall risk to earthworm populations. Linking our model mechanistically to crop yield models would aid the optimization of crop management systems by exploring the trade-off between different ecosystem services.
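The coupling of an energy budget to an individual-based simulation can be illustrated with a deliberately toy sketch. All rates, thresholds and rules below are invented for illustration; they are not the authors' calibrated model.

```python
# Toy individual-based energy-budget model: each earthworm ingests
# soil organic matter (SOM), pays maintenance, dies of starvation if
# its reserve falls too low, and reproduces once a reserve threshold
# is reached. Pesticide exposure is modelled as reduced intake.

def simulate(n_worms=10, som=1.0, pesticide=0.0, days=100):
    """Return population size after `days` of daily energy bookkeeping."""
    reserves = [0.0] * n_worms
    for _ in range(days):
        next_reserves = []
        for r in reserves:
            r += som * (1.0 - pesticide) - 0.5   # intake minus maintenance
            if r < -5.0:                          # starvation death
                continue
            if r >= 10.0:                         # reproduction threshold
                next_reserves.append(0.0)         # offspring starts at zero
                r = 0.0
            next_reserves.append(r)
        reserves = next_reserves
    return len(reserves)

healthy  = simulate()                  # net energy gain: population grows
stressed = simulate(pesticide=0.8)     # net energy loss: population declines
```

The real model tracks individual behaviour, soil conditions and pesticide dissipation over time, but the structure is the same: population-level outcomes emerge from per-individual energy bookkeeping.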

Relevance:

30.00%

Publisher:

Abstract:

Thiol-bearing microgels have been synthesised from copolymerisation of 2-(acetylthio)ethylacrylate and 2-hydroxyethylmethacrylate, and subsequent deprotection using sodium thiomethoxide. The concentration of thiol groups on these microgels could be tailored by use of different molar ratios of the two monomers. These thiol-bearing microgels were shown to adhere to ex vivo porcine urinary bladder, which was correlated with their level of thiolation. By simply mixing solutions of thiol-bearing microgels and doxorubicin, high levels of drug loading into the microgels could be achieved. Thiol-bearing microgels controlled the release of doxorubicin in a time-dependent manner over several hours. These doxorubicin-loaded thiol-bearing microgels could have application in the treatment of early-stage bladder cancers. The method used represents a new ‘bottom-up’ approach for the synthesis of novel mucoadhesive microgels.

Relevance:

30.00%

Publisher:

Abstract:

Information can be interpreted as in-formation, which refers to the potential of the form for a mediation of meaning. In this paper we focus on reasoning information and consider the question of how the form involved in reasoning can be used for an analysis of accounting narratives in corporate disclosures. An evaluation of experimental results is included.

Relevance:

30.00%

Publisher:

Abstract:

Geotechnical systems, such as landfills, mine tailings storage facilities (TSFs), slopes, and levees, are required to perform safely throughout their service life, which can span from decades for levees to “in perpetuity” for TSFs. The conventional design practice by geotechnical engineers for these systems utilizes the as-built material properties to predict their performance throughout the required service life. The implicit assumption in this design methodology is that the soil properties are stable through time. This is counter to long-term field observations of these systems, particularly where ecological processes such as plant, animal, biological, and geochemical activity are present. Plant roots can densify soil and/or increase hydraulic conductivity, burrowing animals can increase seepage, biological activity can strengthen soil, geochemical processes can increase stiffness, etc. The engineering soil properties naturally change as a stable ecological system is gradually established following initial construction, and these changes alter system performance. This paper presents an integrated perspective and new approach to this issue, considering ecological, geotechnical, and mining demands and constraints. A series of data sets and case histories are utilized to examine these issues and to propose a more integrated design approach, and consideration is given to future opportunities to manage engineered landscapes as ecological systems. We conclude that soil scientists and restoration ecologists must be engaged in initial project design and geotechnical engineers must be active in long-term management during the facility’s service life. For near-surface geotechnical structures in particular, this requires an interdisciplinary perspective and the embracing of soil as a living ecological system rather than an inert construction material.

Relevance:

30.00%

Publisher:

Abstract:

Background: 29 autoimmune diseases, including Rheumatoid Arthritis, gout, Crohn’s Disease, and Systemic Lupus Erythematosus, affect 7.6-9.4% of the population. While effective therapy is available, many patients do not follow treatment or use medications as directed. Digital health and Web 2.0 interventions have demonstrated much promise in increasing medication and treatment adherence, but to date many Internet tools have proven disappointing. In fact, most digital interventions continue to suffer from high attrition in patient populations, are burdensome for healthcare professionals, and have relatively short life spans. Objective: Digital health tools have traditionally centered on the transformation of existing interventions (such as diaries, trackers, stage-based or cognitive behavioral therapy programs, coupons, or symptom checklists) to electronic format. Advanced digital interventions have also incorporated attributes of Web 2.0 such as social networking, text messaging, and the use of video. Despite these efforts, there has been little measurable impact on non-adherence for illnesses that require medical interventions, and research must look to other strategies or development methodologies. As a first step in investigating the feasibility of developing such a tool, the objective of the current study is to systematically rate factors of non-adherence that have been reported in past research studies. Methods: Grounded Theory, recognized as a rigorous method that facilitates the emergence of new themes through systematic analysis, data collection and coding, was used to analyze quantitative, qualitative and mixed-method studies addressing the following autoimmune diseases: Rheumatoid Arthritis, gout, Crohn’s Disease, Systemic Lupus Erythematosus, and inflammatory bowel disease. Studies were only included if they contained primary data addressing the relationship with non-adherence.
Results: Out of the 27 studies, four non-modifiable and 11 modifiable risk factors were discovered. Over one third of articles identified the following risk factors as common contributors to medication non-adherence (percent of studies reporting): patients not understanding treatment (44%), side effects (41%), age (37%), dose regimen (33%), and perceived medication ineffectiveness (33%). An unanticipated finding that emerged was the need for risk stratification tools (81%) with patient-centric approaches (67%). Conclusions: This study systematically identifies and categorizes medication non-adherence risk factors in select autoimmune diseases. Findings indicate that patients’ understanding of their disease and the role of medication are paramount. An unexpected finding was that the majority of research articles called for the creation of tailored, patient-centric interventions that dispel personal misconceptions about disease, pharmacotherapy, and how the body responds to treatment. To our knowledge, these interventions do not yet exist in digital format. Rather than adopting a systems-level approach, digital health programs should focus on cohorts with heterogeneous needs, and develop tailored interventions based on individual non-adherence patterns.

Relevance:

30.00%

Publisher:

Abstract:

Multibiometrics aims at improving biometric security in the presence of spoofing attempts, but exposes a larger number of points of attack. Standard fusion rules have been shown to be highly sensitive to spoofing attempts, even in the case of a single fake instance. This paper presents a novel spoofing-resistant fusion scheme proposing the detection and elimination of anomalous fusion input in an ensemble of evidence with liveness information. This approach aims at making multibiometric systems more resistant to presentation attacks by modeling the typical behaviour of human surveillance operators detecting anomalies, as employed in many decision support systems. It is shown to improve security, while retaining the high accuracy level of standard fusion approaches, on the latest Fingerprint Liveness Detection Competition (LivDet) 2013 dataset.
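The elimination-before-fusion idea can be sketched as follows. The scores and the liveness threshold below are invented for illustration; the paper's scheme models operator behaviour rather than using a fixed cut-off.

```python
# Hedged sketch: discard inputs flagged as likely spoofs by a liveness
# detector, then apply standard sum-rule (average) fusion to the rest.

def fuse(scores, liveness, threshold=0.5):
    """Sum-rule fusion restricted to inputs that pass the liveness check."""
    kept = [s for s, l in zip(scores, liveness) if l >= threshold]
    return sum(kept) / len(kept) if kept else 0.0

# An impostor presenting one spoofed finger with a high match score
# but low liveness, alongside two genuine low-scoring instances:
scores   = [0.95, 0.20, 0.25]     # match score per biometric instance
liveness = [0.10, 0.90, 0.85]     # liveness detector output per instance

naive  = sum(scores) / len(scores)   # standard sum rule: inflated by the spoof
robust = fuse(scores, liveness)      # spoof eliminated before fusion
```

With the spoofed instance removed, the fused score stays low and the impostor is rejected, whereas the naive sum rule is pulled up by the single fake.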