829 results for Multi-classifier systems
Abstract:
Robustness in multi-variable control system design requires that the solution to the design problem be insensitive to perturbations in the system data. In this paper we discuss measures of robustness for generalized state-space, or descriptor, systems and describe algorithmic techniques for optimizing robustness for various applications.
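As a minimal numerical sketch only (this is an assumption for illustration, not the robustness measures or optimization algorithms of the paper): for a descriptor system E x'(t) = A x(t) with singular E, robustness of stability can be probed through the finite generalized eigenvalues of the pencil (A, E) and their conditioning.

```python
# Sketch only: crude robustness indicators for a descriptor system E x'(t) = A x(t),
# based on the finite generalized eigenvalues of the pencil (A, E).
import numpy as np
from scipy.linalg import eig

# Illustrative matrices; E is singular, so this is a genuine descriptor system.
E = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])
A = np.array([[-2.0, 1.0, 0.0],
              [0.0, -3.0, 1.0],
              [1.0, 0.0, -1.0]])

w, vl, vr = eig(A, E, left=True, right=True, homogeneous_eigvals=True)
alphas, betas = w
finite = np.abs(betas) > 1e-12
lam = alphas[finite] / betas[finite]          # finite generalized eigenvalues

# Two crude indicators: the stability margin of the finite spectrum, and a
# per-eigenvalue sensitivity 1 / |y^H E x| from the left/right eigenvectors.
margin = -np.max(lam.real)
sens = [1.0 / abs(vl[:, i].conj() @ E @ vr[:, i]) for i in np.where(finite)[0]]
print("stability margin:", margin)
print("eigenvalue sensitivities:", sens)
```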
Abstract:
Business and IT alignment is increasingly acknowledged as a key to organisational performance. However, alignment research lacks mechanisms that enable an on-going alignment process with multi-level effects. Multi-level learning supports on-going effectiveness through development of the organisation and improved quality of business and IT strategies. In particular, exploration and exploitation enable an effective alignment process across dynamic multi-level learning. Hence, this paper proposes a conceptual framework that links multi-level learning and business-IT strategy through the concepts of exploration and exploitation, considering short-term and long-term alignment together to address the challenges of strategic alignment faced in sustaining organisational performance.
Abstract:
In data fusion systems, one often encounters measurements of past target locations and then wishes to deduce where the targets are currently located. Recent research on the processing of such out-of-sequence data has culminated in the development of a number of algorithms for solving the associated tracking problem. This paper reviews these different approaches in a common Bayesian framework and proposes an architecture that orthogonalises the data association and out-of-sequence problems such that any combination of solutions to these two problems can be used together. The emphasis is not on advocating one approach over another on the basis of computational expense, but rather on understanding the relationships between the algorithms so that any approximations made are explicit.
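As an illustrative baseline only (this is the naive buffer-and-reprocess approach under an assumed scalar constant-velocity model, not one of the retrodiction algorithms reviewed in the paper): an out-of-sequence measurement can always be handled by rolling the filter back and re-running it in time order, which is the yardstick the more economical algorithms aim to match.

```python
# Sketch only: handle a late ("out-of-sequence") measurement by re-filtering the
# buffered measurements in time order with a 1-D constant-velocity Kalman filter.
import numpy as np

def kf_step(x, P, z, dt, q=0.1, r=1.0):
    """One predict-plus-update step; state x = [position, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    x, P = F @ x, F @ P @ F.T + Q                     # predict to the measurement time
    S = H @ P @ H.T + r
    K = P @ H.T / S
    x = x + (K * (z - H @ x)).ravel()                 # update with the measurement
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Measurements arrive out of order: the one timestamped t=2.0 shows up last.
arrivals = [(1.0, 0.9), (3.0, 3.2), (2.0, 2.1)]       # (timestamp, measured position)
x, P, t_prev = np.zeros(2), np.eye(2), 0.0
for t, z in sorted(arrivals):                         # reprocess in time order
    x, P = kf_step(x, P, z, t - t_prev)
    t_prev = t
print("state after reprocessing:", x)
```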
Abstract:
Diaminofluoresceins are widely used probes for detection and intracellular localization of NO formation in cultured/isolated cells and intact tissues. The fluorinated derivative, 4-amino-5-methylamino-2′,7′-difluorofluorescein (DAF-FM), has gained increasing popularity in recent years due to its improved NO-sensitivity, pH-stability, and resistance to photo-bleaching compared to the first-generation compound, DAF-2. Detection of NO production by either reagent relies on conversion of the parent compound into a fluorescent triazole, DAF-FM-T and DAF-2-T, respectively. While this reaction is specific for NO and/or reactive nitrosating species, it is also affected by the presence of oxidants/antioxidants. Moreover, the reaction with other molecules can lead to the formation of fluorescent products other than the expected triazole. Thus additional controls and structural confirmation of the reaction products are essential. Using human red blood cells as an exemplary cellular system we here describe robust protocols for the analysis of intracellular DAF-FM-T formation using an array of fluorescence-based methods (laser-scanning fluorescence microscopy, flow cytometry and fluorimetry) and analytical separation techniques (reversed-phase HPLC and LC-MS/MS). When used in combination, these assays afford unequivocal identification of the fluorescent signal as being derived from NO and are applicable to most other cellular systems without or with only minor modifications.
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
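For reference on how a score like the quoted pattern correlations (0.62 initialized versus 0.31 uninitialized) is typically computed, here is a minimal sketch of a centred, cosine-latitude-weighted spatial correlation; the gridding, weighting and anomaly conventions are assumptions, not details taken from the paper.

```python
# Sketch only: centred, area-weighted pattern correlation between a forecast
# anomaly field and an observed anomaly field on a regular lat-lon grid.
import numpy as np

def pattern_correlation(forecast, observed, lat):
    """Cosine-latitude-weighted spatial correlation of two (nlat, nlon) fields."""
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(forecast)
    w /= w.sum()
    f = forecast - np.sum(w * forecast)   # remove the weighted spatial mean
    o = observed - np.sum(w * observed)
    return np.sum(w * f * o) / np.sqrt(np.sum(w * f**2) * np.sum(w * o**2))

# Toy anomaly fields on a coarse 5-degree grid, purely illustrative.
lat = np.linspace(-87.5, 87.5, 36)
rng = np.random.default_rng(0)
obs = rng.standard_normal((lat.size, 72))
fcst = 0.6 * obs + 0.8 * rng.standard_normal(obs.shape)
print("pattern correlation:", round(pattern_correlation(fcst, obs, lat), 2))
```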
Abstract:
Whilst hydrological systems can show resilience to short-term streamflow deficiencies during within-year droughts, prolonged deficits during multi-year droughts are a significant threat to water resources security in Europe. This study uses a threshold-based objective classification of regional hydrological drought to qualitatively examine the characteristics, spatio-temporal evolution and synoptic climatic drivers of multi-year drought events in 1962–64, 1975–76 and 1995–97, on a European scale but with particular focus on the UK. Whilst all three events are multi-year, pan-European phenomena, their development and causes can be contrasted. The critical factor in explaining the unprecedented severity of the 1975–76 event is the consecutive occurrence of winter and summer drought. In contrast, 1962–64 was a succession of dry winters, mitigated by quiescent summers, whilst 1995–97 lacked spatial coherence and was interrupted by wet interludes. Synoptic climatic conditions vary within and between multi-year droughts, suggesting that regional factors modulate the climate signal in streamflow drought occurrence. Despite being underpinned by qualitatively similar climatic conditions and commonalities in evolution and characteristics, each of the three droughts has a unique spatio-temporal signature. An improved understanding of the spatio-temporal evolution and characteristics of multi-year droughts has much to contribute to monitoring and forecasting capability, and to improved mitigation strategies.
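As a sketch of the threshold-based idea only (the Q90 daily threshold and the simple event pooling below are assumptions, not the paper's regional classification scheme): streamflow drought events can be extracted as runs of days on which flow stays below a fixed low-flow percentile, with the accumulated deficit giving a measure of severity.

```python
# Sketch only: identify drought events as runs of days where streamflow stays
# below its 10th-percentile (Q90) flow, and accumulate the deficit volume.
import numpy as np

def drought_events(flow, pct=10):
    """Return (start, end, deficit) tuples for below-threshold runs."""
    threshold = np.percentile(flow, pct)
    below = flow < threshold
    events, start = [], None
    for i, b in enumerate(below):
        if b and start is None:
            start = i
        elif not b and start is not None:
            events.append((start, i - 1, np.sum(threshold - flow[start:i])))
            start = None
    if start is not None:
        events.append((start, len(flow) - 1, np.sum(threshold - flow[start:])))
    return events

# Synthetic daily flow series with an imposed dry spell, purely illustrative.
rng = np.random.default_rng(1)
flow = 10 + rng.gamma(2.0, 2.0, size=730)
flow[400:480] *= 0.3                      # multi-month deficit
for start, end, deficit in drought_events(flow):
    print(f"event days {start}-{end}, deficit {deficit:.1f}")
```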
Abstract:
The increasing use of drug combinations to treat disease states, such as cancer, calls for improved delivery systems that are able to deliver multiple agents. Herein, we report a series of novel Janus dendrimers with potential for use in combination therapy. Different generations (first and second) of PEG-based dendrons containing two different “model drugs”, benzyl alcohol (BA) and 3-phenylpropionic acid (PPA), were synthesized. BA and PPA were attached via two different linkers (carbonate and ester, respectively) to promote differential drug release. The four dendrons were coupled together via (3 + 2) cycloaddition chemistries to afford four Janus dendrimers, which contained varying amounts and different ratios of BA and PPA, namely, (BA)2-G1-G1-(PPA)2, (BA)4-G2-G1-(PPA)2, (BA)2-G1-G2-(PPA)4, and (BA)4-G2-G2-(PPA)4. Release studies in plasma showed that the dendrimers provided sequential release of the two model drugs, with BA being released faster than PPA from all of the dendrons. The different dendrimers allowed delivery of the two model drug compounds in increasing amounts (0.15–0.30 mM) and in exact molecular ratios (1:2; 2:1; 1:2; 2:2). The dendrimers were noncytotoxic (100% viability at 1 mg/mL) toward human umbilical vein endothelial cells (HUVEC) and nontoxic toward red blood cells, as confirmed by hemolysis studies. These studies demonstrate that these Janus PEG-based dendrimers offer great potential for the delivery of drugs via combination therapy.
Abstract:
Three decades of on-going executive concern over how to achieve successful alignment between business and information technology show the complexity of such a vital process. Most of the challenges of alignment are related to knowledge and organisational change, and several researchers have introduced mechanisms to address some of these challenges. However, these mechanisms pay less attention to multi-level effects, which results in a limited understanding of alignment across levels. Therefore, we reviewed these challenges from a multi-level learning perspective and found that business and IT alignment is related to the balance of exploitation and exploration strategies with the intellectual content of individual, group and organisational levels.
Abstract:
The planning of semi-autonomous vehicles in traffic scenarios is a relatively new problem that contributes towards the goal of making road travel by vehicles free of human drivers. An algorithm needs to ensure optimal real-time planning of multiple vehicles (moving in either direction along a road) in the presence of a complex obstacle network. Unlike other approaches, here we assume that speed lanes are not present and that different lanes do not need to be maintained for inbound and outbound traffic. Our basic hypothesis is to carry forward the planning task to ensure that a sufficient distance is maintained by each vehicle from all other vehicles, obstacles and road boundaries. We present here a 4-layer planning algorithm that consists of road selection (for selecting the individual roads of traversal to reach the goal), pathway selection (a strategy to avoid and/or overtake obstacles, road diversions and other blockages), pathway distribution (to select the position of a vehicle at every instant of time in a pathway), and trajectory generation (for generating a curve smooth enough to allow the maximum possible speed). Cooperation between vehicles is handled separately at the different levels, the aim being to maximize the separation between vehicles. Simulated results exhibit smooth, efficient and safe driving of vehicles in multiple scenarios, along with typical vehicle behaviours including following and overtaking.
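A minimal structural sketch of such a layered planner (every name and the toy layer bodies below are placeholders, not the authors' algorithm): each layer consumes the output of the layer above, from road selection down to trajectory generation.

```python
# Sketch only: skeleton of a 4-layer planner; the toy bodies illustrate the data
# flow between layers, not any real planning logic.
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def select_roads(road_graph: Dict[Tuple[str, str], List[str]], start: str, goal: str) -> List[str]:
    # Layer 1 (road selection): choose the individual roads of traversal to the goal.
    return road_graph.get((start, goal), [start, goal])

def select_pathway(roads: List[str], obstacles: List[Point]) -> List[Point]:
    # Layer 2 (pathway selection): choose a corridor avoiding obstacles/blockages
    # (toy: one waypoint per road, ignoring the obstacles).
    return [(float(i), 0.0) for i, _ in enumerate(roads)]

def distribute_pathway(pathway: List[Point]) -> List[Point]:
    # Layer 3 (pathway distribution): fix the vehicle's position at every time
    # instant (toy: one waypoint per time step).
    return list(pathway)

def generate_trajectory(slots: List[Point]) -> List[Point]:
    # Layer 4 (trajectory generation): smooth the waypoints into a drivable curve
    # (toy: midpoint insertion instead of a real smoothing spline).
    out: List[Point] = []
    for a, b in zip(slots, slots[1:]):
        out += [a, ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)]
    return out + slots[-1:]

roads = select_roads({("A", "C"): ["A", "B", "C"]}, "A", "C")
trajectory = generate_trajectory(distribute_pathway(select_pathway(roads, [])))
print(trajectory)
```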
Abstract:
This study focuses on the analysis of winter (October-November-December-January-February-March; ONDJFM) storm events and their changes due to increased anthropogenic greenhouse gas concentrations over Europe. In order to assess uncertainties that are due to model formulation, 4 regional climate models (RCMs) with 5 high resolution experiments, and 4 global general circulation models (GCMs) are considered. Firstly, cyclone systems as synoptic scale processes in winter are investigated, as they are a principal cause of the occurrence of extreme, damage-causing wind speeds. This is achieved by use of an objective cyclone identification and tracking algorithm applied to GCMs. Secondly, changes in extreme near-surface wind speeds are analysed. Based on percentile thresholds, the studied extreme wind speed indices allow a consistent analysis over Europe that takes systematic deviations of the models into account. Relative changes in both intensity and frequency of extreme winds and their related uncertainties are assessed and related to changing patterns of extreme cyclones. A common feature of all investigated GCMs is a reduced track density over central Europe under climate change conditions, if all systems are considered. If only extreme (i.e. the strongest 5%) cyclones are taken into account, an increasing cyclone activity for western parts of central Europe is apparent; however, the climate change signal reveals a reduced spatial coherency when compared to all systems, which exposes partially contrary results. With respect to extreme wind speeds, significant positive changes in intensity and frequency are obtained over at least 3 and 20% of the European domain under study (35–72°N and 15°W–43°E), respectively. Location and extension of the affected areas (up to 60 and 50% of the domain for intensity and frequency, respectively), as well as levels of changes (up to +15 and +200% for intensity and frequency, respectively) are shown to be highly dependent on the driving GCM, whereas differences between RCMs when driven by the same GCM are relatively small.
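To make the percentile-threshold idea concrete, here is a minimal sketch (the 98th percentile, the single grid-cell view and the period lengths are assumptions, not values from the paper): the threshold is defined per model from its own present-day climate, so that systematic model biases are taken into account when computing the extreme wind indices.

```python
# Sketch only: relative changes (%) in intensity and frequency of extreme winds,
# using a percentile threshold defined from the model's own control climate.
import numpy as np

def extreme_wind_changes(ctrl_wind, scen_wind, pct=98.0):
    """Return (intensity change %, frequency change %) for one grid cell."""
    thresh = np.percentile(ctrl_wind, pct)            # model-specific threshold
    intensity = (np.percentile(scen_wind, pct) / thresh - 1.0) * 100.0
    frequency = (np.mean(scen_wind > thresh) / np.mean(ctrl_wind > thresh) - 1.0) * 100.0
    return intensity, frequency

# Toy daily winter wind speeds for one grid cell, purely illustrative.
rng = np.random.default_rng(2)
ctrl = rng.weibull(2.0, size=30 * 180) * 8.0          # 30 winters, control climate
scen = rng.weibull(2.0, size=30 * 180) * 8.4          # slightly intensified climate
print("intensity / frequency change (%):", extreme_wind_changes(ctrl, scen))
```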
Abstract:
Hybrid multiprocessor architectures which combine re-configurable computing and multiprocessors on a chip are being proposed to transcend the performance of standard multi-core parallel systems. Both fine-grained and coarse-grained parallel algorithm implementations are feasible in such hybrid frameworks. A compositional strategy for designing fine-grained multi-phase regular processor arrays to target hybrid architectures is presented in this paper. The method is based on deriving component designs using classical regular array techniques and composing the components into a unified global design. Phase changes and run-time data routing are characteristic features of the resulting designs. In order to describe the data transfer between phases, the concept of a communication domain is introduced so that the producer–consumer relationship arising from multi-phase computation can be treated in a unified way as a data routing phase. This technique is applied to derive new designs of multi-phase regular arrays with different dataflow between phases of computation.
Abstract:
The response of monsoon circulation in the northern and southern hemisphere to 6 ka orbital forcing has been examined in 17 atmospheric general circulation models and 11 coupled ocean–atmosphere general circulation models. The atmospheric response to increased summer insolation at 6 ka in the northern subtropics strengthens the northern-hemisphere summer monsoons and leads to increased monsoonal precipitation in western North America, northern Africa and China; ocean feedbacks amplify this response and lead to a further increase in monsoon precipitation in these three regions. The atmospheric response to reduced summer insolation at 6 ka in the southern subtropics weakens the southern-hemisphere summer monsoons and leads to decreased monsoonal precipitation in northern South America, southern Africa and northern Australia; ocean feedbacks weaken this response so that the decrease in rainfall is smaller than might otherwise be expected. The role of the ocean in monsoonal circulation in other regions is more complex. There is no discernible impact of orbital forcing in the monsoon region of North America in the atmosphere-only simulations, but a strong increase in precipitation in the ocean–atmosphere simulations. In contrast, there is a strong atmospheric response to orbital forcing over northern India, but ocean feedback reduces the strength of the change in the monsoon, although it still remains stronger than today. Although there are differences in magnitude and exact location of regional precipitation changes from model to model, the same basic mechanisms are involved in the oceanic modulation of the response to orbital forcing, and this gives rise to a robust ensemble response for each of the monsoon systems. Comparison of simulated and reconstructed changes in regional climate suggests that the coupled ocean–atmosphere simulations produce more realistic changes in the northern-hemisphere monsoons than atmosphere-only simulations, though they underestimate the observed changes in precipitation in all regions. Evaluation of the southern-hemisphere monsoons is limited by the lack of quantitative reconstructions, but suggests that model skill in simulating these monsoons is limited.
Abstract:
Medium-range flood forecasting activities, driven by various meteorological forecasts ranging from high-resolution deterministic forecasts to low-spatial-resolution ensemble prediction systems, share a major challenge in the appropriateness and design of performance measures. In this paper possible limitations of some traditional hydrological and meteorological prediction quality and verification measures are identified. Some simple modifications are applied in order to circumvent the problem of autocorrelation dominating river discharge time series, and in order to create a benchmark model enabling decision makers to evaluate both the forecast quality and the model quality. Although the performance period is quite short, the advantage of a simple cost-loss function as a measure of forecast quality can be demonstrated.
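For context, a minimal sketch of the standard static cost-loss model (the decision rule, the toy probabilities and the C/L values below are assumptions, not taken from the paper): a user pays a cost C to protect against a flood or suffers a loss L if an unprotected flood occurs, and the forecast's relative economic value compares its mean expense with those of climatology and a perfect forecast.

```python
# Sketch only: relative economic value of a probabilistic flood forecast under a
# static cost-loss model (protect whenever the forecast probability exceeds C/L).
import numpy as np

def relative_value(prob_forecast, occurred, C, L):
    """1 = perfect forecast, 0 = no better than climatology, negative = worse."""
    protect = prob_forecast >= C / L
    expense_fcst = np.mean(np.where(protect, C, occurred * L))
    clim = np.mean(occurred)                          # climatological event frequency
    expense_clim = min(C, clim * L)                   # best of always/never protecting
    expense_perfect = clim * C                        # protect only when the event occurs
    return (expense_clim - expense_fcst) / (expense_clim - expense_perfect)

# Toy forecast probabilities and outcomes, purely illustrative.
rng = np.random.default_rng(3)
occurred = rng.random(500) < 0.1
probs = np.clip(0.1 + 0.6 * occurred + 0.15 * rng.standard_normal(500), 0.0, 1.0)
print("relative value:", round(relative_value(probs, occurred, C=1.0, L=10.0), 2))
```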
Abstract:
Smart healthcare is a complex domain for systems integration because of the human and technical factors and heterogeneous data sources involved. As part of the smart city, it is an area where clinical functions require smart multi-system collaboration for effective communication among departments, and radiology is one of the areas that relies most heavily on intelligent information integration and communication. It therefore faces many challenges to integration and interoperability, such as information collision, heterogeneous data sources, policy obstacles, and procedure mismanagement. The purpose of this study is to analyse the data, semantic, and pragmatic interoperability of systems integration in a radiology department, and to develop a pragmatic interoperability framework for guiding the integration. We select an on-going project at a local hospital for our case study. The project aims to achieve data sharing and interoperability among Radiology Information Systems (RIS), Electronic Patient Record (EPR), and Picture Archiving and Communication Systems (PACS). Qualitative data collection and analysis methods are used. The data sources consist of documentation (including publications and internal working papers), one year of non-participant observations, and 37 interviews with radiologists, clinicians, directors of IT services, referring clinicians, radiographers, receptionists and a secretary. We identified four primary phases of the data analysis process for the case study: requirements and barriers identification, integration approach, interoperability measurements, and knowledge foundations. Each phase is discussed and supported by qualitative data. Through the analysis we also develop a pragmatic interoperability framework that summarises the empirical findings and proposes recommendations for guiding integration in the radiology context.