997 results for Problem Resolution


Relevance:

30.00%

Publisher:

Abstract:

The issue: The European Union's pre-crisis growth performance was disappointing enough, but performance has been even more dismal since the onset of the crisis. Weak growth is undermining private and public deleveraging, and is fuelling continued banking fragility. Persistently high unemployment is eroding skills, discouraging labour market participation and undermining the EU's long-term growth potential. Low overall growth is making it much tougher for the hard-hit economies in southern Europe to recover competitiveness and regain control of their public finances. Stagnation would reduce the attractiveness of Europe for investment. Under these conditions, Europe's social models are bound to prove unsustainable. Policy challenge: The European Union's weak long-term growth potential and unsatisfactory recovery from the crisis represent a major policy challenge. Over and above the structural reform agenda, which is vitally important, bold policy action is needed. The priority is to get bank credit going: banking problems need to be assessed properly, and bank resolution and recapitalisation should be pursued. Second, fostering the reallocation of factors to the most productive firms and to the sectors that contribute to aggregate rebalancing is vital; addressing intra-euro-area competitiveness divergence is essential to support growth in southern Europe. Third, the speed of fiscal adjustment needs to be appropriate: EU funds should be front-loaded to countries in deep recession, while the European Investment Bank should increase investment.

Relevance:

30.00%

Publisher:

Abstract:

High-impact, localized intense rainfall episodes represent a major socio-economic problem for societies worldwide, and at the same time these events are notoriously difficult to simulate properly in climate models. Here, the authors investigate how horizontal resolution and model formulation influence this issue by applying the HARMONIE regional climate model (HCLIM) with three different setups: two using convection parameterization at 15 and 6.25 km horizontal resolution (the latter within the "grey-zone" scale), with lateral boundary conditions provided by ERA-Interim reanalysis and integrated over a pan-European domain, and one with explicit convection at 2 km resolution (HCLIM2) over the Alpine region, driven by the 15 km model. Seven summer seasons were sampled and validated against two high-resolution observational data sets. All HCLIM versions underestimate the number of dry days and hours by 20-40%, and overestimate precipitation over the Alpine ridge. Also, only modest added value was found at "grey-zone" resolution. However, the single most important outcome is the substantial added value in HCLIM2 compared to the coarser model versions at sub-daily time scales. It better captures the local-to-regional spatial patterns of precipitation, reflecting a more realistic representation of the local and meso-scale dynamics. Further, the duration and spatial frequency of precipitation events, as well as extremes, are closer to observations. These characteristics are key ingredients in heavy rainfall events and associated flash floods, and the outstanding results using HCLIM in a convection-permitting setting are convincing and encourage further use of the model to study changes in such events in a changing climate.
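The dry-day validation quoted above (a 20-40% underestimate) reduces to a simple frequency-bias statistic. The sketch below is a minimal illustration, not the HCLIM validation code; the 1 mm/day wet-day threshold and the two series are illustrative assumptions:

```python
import numpy as np

def dry_day_bias(model_pr, obs_pr, threshold=1.0):
    """Relative bias (%) in the number of dry days (precipitation below
    `threshold`, in mm/day) between modelled and observed daily series."""
    model_dry = np.sum(np.asarray(model_pr) < threshold)
    obs_dry = np.sum(np.asarray(obs_pr) < threshold)
    return 100.0 * (model_dry - obs_dry) / obs_dry

# Hypothetical series: the model drizzles too often, so it has too few dry days.
obs = np.array([0.0, 0.0, 0.0, 5.0, 0.2, 0.0, 12.0, 0.0])   # 6 dry days
model = np.array([0.3, 1.5, 0.0, 4.0, 2.0, 0.0, 9.0, 1.2])  # 3 dry days
print(round(dry_day_bias(model, obs), 1))  # -50.0
```

A negative value means the model underestimates the number of dry days, which is the direction of the bias reported for all HCLIM versions.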

Relevance:

30.00%

Publisher:

Abstract:

Visualisation of multiple isoforms of kappa-casein on 2-D gels is restricted by the abundant alpha- and beta-caseins that not only limit gel loading but also migrate to similar regions as the more acidic kappa-casein isoforms. To overcome this problem, we took advantage of the absence of cysteine residues in alpha(S1)- and beta-casein by devising an affinity enrichment procedure based on reversible biotinylation of cysteine residues. Affinity capture of cysteine-containing proteins on avidin allowed the removal of the vast majority of alpha(S1)- and beta-casein, and on subsequent 2-D gel analysis 16 gel spots were identified as kappa-casein by PMF. Further analysis of the C-terminal tryptic peptide along with structural predictions based on mobility on the 2-D gel allowed us to assign identities to each spot in terms of genetic variant (A or B), phosphorylation status (1, 2 or 3) and glycosylation status (from 0 to 6). Eight isoforms of the A and B variants with the same PTMs were observed. When the casein fraction of milk from a single cow, homozygous for the B variant of kappa-casein, was used as the starting material, 17 isoforms from 13 gel spots were characterised. Analysis of isoforms of low abundance proved challenging due to the low amount of material that could be extracted from the gels as well as the lability of the PTMs during MS analysis. However, we were able to identify a previously unrecognised site, T-166, that could be phosphorylated or glycosylated. Despite many decades of analysis of milk proteins, the reasons for this high level of heterogeneity are still not clear.

Relevance:

30.00%

Publisher:

Abstract:

Magnetoencephalography (MEG) is a non-invasive brain imaging technique with the potential for very high temporal and spatial resolution of neuronal activity. The main stumbling block for the technique has been that the estimation of a neuronal current distribution, based on sensor data outside the head, is an inverse problem with an infinity of possible solutions. Many inversion techniques exist, all using different a priori assumptions in order to reduce the number of possible solutions. Although all techniques can be thoroughly tested in simulation, implicit in the simulations are the experimenter's own assumptions about realistic brain function. To date, the only way to test the validity of inversions based on real MEG data has been through direct surgical validation, or through comparison with invasive primate data. In this work, we constructed a null hypothesis that the reconstruction of neuronal activity contains no information on the distribution of the cortical grey matter. To test this, we repeatedly compared rotated sections of grey matter with a beamformer estimate of neuronal activity to generate a distribution of mutual information values. The significance of the comparison between the un-rotated anatomical information and the electrical estimate was subsequently assessed against this distribution. We found that there was significant (P < 0.05) anatomical information contained in the beamformer images across a number of frequency bands. Based on the limited data presented here, we can say that the assumptions behind the beamformer algorithm are not unreasonable for the visual-motor task investigated.
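The rotation-based null test can be sketched in a simplified one-dimensional form. Everything here is an illustrative assumption rather than the authors' pipeline: the plug-in histogram estimator of mutual information, the use of circular shifts in place of true anatomical rotations, and the synthetic "anatomy" and "power" maps:

```python
import numpy as np

def mutual_info(x, y, bins=8):
    """Plug-in mutual information (nats) between two maps, from a joint histogram."""
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def rotation_null_test(anatomy, power, n_shift=200, seed=1):
    """Observed MI and its p-value against a null of misaligned anatomy copies.
    Circular shifts stand in for the rotated grey-matter sections; near-identity
    shifts are excluded so every null copy is genuinely misaligned."""
    rng = np.random.default_rng(seed)
    n = anatomy.size
    observed = mutual_info(anatomy, power)
    null = np.array([mutual_info(np.roll(anatomy, rng.integers(n // 4, 3 * n // 4)), power)
                     for _ in range(n_shift)])
    return observed, float(np.mean(null >= observed))

# Hypothetical 1-D demo: a grey-matter "bump" and a noisy activity map over it.
x = np.linspace(0.0, 1.0, 512)
anatomy = np.exp(-((x - 0.5) ** 2) / 0.01)
power = anatomy + 0.05 * np.random.default_rng(0).normal(size=x.size)
obs, p = rotation_null_test(anatomy, power)
```

Because the activity map here really does track the anatomy, the un-shifted mutual information should exceed essentially all of the null values, mirroring the P < 0.05 result the abstract reports.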

Relevance:

30.00%

Publisher:

Abstract:

We perform numerical simulations on a model describing a Brillouin-based temperature and strain sensor, testing its response when it is probed with relatively short pulses. Experimental results were recently published [e.g., Opt. Lett. 24, 510 (1999)] that showed a broadening of the Brillouin loss curve when the probe pulse duration is reduced, followed by a sudden and rather surprising reduction of the linewidth when the pulse duration gets shorter than the acoustic relaxation time. Our study reveals the processes responsible for this behavior. We give a clear physical insight into the problem, allowing us to define the experimental conditions required to take advantage of this effect.
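The broadening regime described above follows from a simple Fourier argument: the measured loss curve is roughly the intrinsic Brillouin line convolved with the probe pulse's power spectrum, so shorter pulses (broader spectra) widen the line. The sketch below illustrates only this convolution picture, with hypothetical parameters (a 35 MHz intrinsic linewidth and ideal square pulses); it does not reproduce the sub-relaxation-time narrowing, which requires the full transient model studied in the paper:

```python
import numpy as np

def fwhm(f, y):
    """Full width at half maximum, read off the frequency grid by thresholding."""
    above = f[y >= y.max() / 2.0]
    return above[-1] - above[0]

def measured_linewidth(tau_ns, gamma_mhz=35.0):
    """Approximate measured Brillouin loss linewidth (MHz): the intrinsic
    Lorentzian line convolved with the sinc^2 power spectrum of a square
    probe pulse of duration tau_ns (nanoseconds)."""
    f = np.linspace(-2000.0, 2000.0, 8001)             # frequency grid, MHz
    lorentz = 1.0 / (1.0 + (2.0 * f / gamma_mhz) ** 2)  # intrinsic line
    pulse = np.sinc(f * 1e-3 * tau_ns) ** 2             # f [MHz] * tau [ns] * 1e-3 is dimensionless
    return fwhm(f, np.convolve(lorentz, pulse, mode="same"))

# Shorter pulse -> broader spectrum -> broader measured line:
wide = measured_linewidth(10.0)     # 10 ns pulse
narrow = measured_linewidth(100.0)  # 100 ns pulse
```

In this simple picture the linewidth grows monotonically as the pulse shortens; the regime below the acoustic relaxation time, where the experiment instead sees narrowing, is exactly where this steady-state convolution argument breaks down.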

Relevance:

30.00%

Publisher:

Abstract:

Parallel resolution procedures based on graph structures are presented. OR-, AND- and DCDP-parallel inference on the connection graph representation is explored, and modifications to these algorithms using heuristic estimation are proposed. The principles for designing these heuristic functions are thoroughly discussed. The colored clause-graph resolution principle is presented. A comparison of efficiency (on the Steamroller problem) is carried out and the results are presented. The parallel unification algorithm used in the parallel inference procedure is briefly outlined in the final part of the paper.
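For readers unfamiliar with resolution, the basic operation that these procedures distribute is the binary resolution step. A minimal sequential sketch follows; the clause encoding (frozensets of string literals) is an illustrative choice, not the paper's connection-graph representation:

```python
def resolve(c1, c2):
    """All binary resolvents of two propositional clauses, represented as
    frozensets of string literals where '-p' is the negation of 'p'."""
    def neg(lit):
        return lit[1:] if lit.startswith("-") else "-" + lit
    return [frozenset((c1 - {lit}) | (c2 - {neg(lit)}))
            for lit in c1 if neg(lit) in c2]

# {p, q} and {-q, r} clash on q and resolve to {p, r}:
resolvent = resolve(frozenset({"p", "q"}), frozenset({"-q", "r"}))
```

An OR-parallel scheme, in rough terms, applies such steps to many candidate clause pairs concurrently; a connection graph makes the candidate pairs explicit as links between complementary literals.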

Relevance:

30.00%

Publisher:

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and carry inherent calibration and noise effects, or to software techniques based on filtering out the binning effect, but without successfully preserving the statistical content of the original data. The mathematical approach introduced in this dissertation is appealing enough that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed.
This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer, multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation that lends itself to real-time execution using lookup tables, a task that is also introduced in this dissertation.
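The lookup-table idea can be sketched under simplifying assumptions: 1024 linear ADC channels, a 4-decade logarithmic axis, and linear splitting of each count between the two neighbouring log bins as a crude stand-in for the dissertation's non-integer, multi-channel accumulation. None of these parameters or mappings come from the dissertation itself:

```python
import numpy as np

def build_log_lut(n_channels=1024, n_bins=256, decades=4.0):
    """Lookup table mapping each linear channel to a fractional position
    (bin index) on a logarithmic axis spanning `decades` decades."""
    ch = np.arange(1, n_channels + 1, dtype=float)   # start at 1 to avoid log(0)
    pos = np.log10(ch / ch[-1]) / decades + 1.0      # normalised log position
    return np.clip(pos, 0.0, 1.0) * (n_bins - 1)     # fractional bin index

def accumulate(events, lut, n_bins=256):
    """Accumulate integer channel events into a log histogram, splitting each
    count between the two neighbouring bins by its fractional index."""
    hist = np.zeros(n_bins)
    idx = lut[events]                        # fractional bin per event
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, n_bins - 1)
    frac = idx - lo
    np.add.at(hist, lo, 1.0 - frac)          # weight to the lower bin
    np.add.at(hist, hi, frac)                # remainder to the upper bin
    return hist

lut = build_log_lut()
events = np.array([0, 10, 100, 1000])        # linear channel numbers
hist = accumulate(events, lut)
print(round(hist.sum(), 6))  # 4.0 -- total counts are preserved
```

The table is computed once, so the per-event work at run time is a lookup and two additions, which is what makes a real-time implementation plausible.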

Relevance:

30.00%

Publisher:

Abstract:

The sensitivity of the tropics to climate change, particularly the amplitude of glacial-to-interglacial changes in sea surface temperature (SST), is one of the great controversies in paleoclimatology. Here we reassess faunal estimates of ice age SSTs, focusing on the problem of no-analog planktonic foraminiferal assemblages in the equatorial oceans that confounds both classical transfer function and modern analog methods. A new calibration strategy developed here, which uses past variability of species to define robust faunal assemblages, solves the no-analog problem and reveals ice age cooling of 5° to 6°C in the equatorial current systems of the Atlantic and eastern Pacific Oceans. Classical transfer functions underestimated temperature changes in some areas of the tropical oceans because core-top assemblages misrepresented the ice age faunal assemblages. Our finding is consistent with some geochemical estimates and model predictions of greater ice age cooling in the tropics than was inferred by Climate: Long-Range Investigation, Mapping, and Prediction (CLIMAP) [1981] and thus may help to resolve a long-standing controversy. Our new foraminiferal transfer function suggests that such cooling was limited to the equatorial current systems, however, and supports CLIMAP's inference of stability of the subtropical gyre centers.
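For context, the modern analog approach that the new calibration strategy builds on can be sketched as follows. The squared chord distance is the standard faunal dissimilarity in this literature, but the miniature training set (four core-top assemblages of three species) is entirely hypothetical, and this is not the authors' new transfer function:

```python
import numpy as np

def squared_chord(a, b):
    """Squared chord distance between two relative-abundance assemblages."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

def mat_sst(sample, coretop_assemblages, coretop_ssts, k=3):
    """Modern Analog Technique: average SST of the k core-top assemblages
    most similar to the fossil sample."""
    d = np.array([squared_chord(sample, c) for c in coretop_assemblages])
    best = np.argsort(d)[:k]
    return coretop_ssts[best].mean()

# Hypothetical training set: 4 core-top assemblages (3 species) with known SSTs.
coretops = np.array([[0.7, 0.2, 0.1],
                     [0.5, 0.3, 0.2],
                     [0.2, 0.3, 0.5],
                     [0.1, 0.2, 0.7]])
ssts = np.array([28.0, 24.0, 18.0, 14.0])
sample = np.array([0.6, 0.25, 0.15])
print(mat_sst(sample, coretops, ssts, k=2))  # 26.0
```

The no-analog problem described above arises exactly when every distance in `d` is large: the fossil assemblage resembles none of the core-top (modern) assemblages, so the averaged analog SSTs are not trustworthy.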

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a summary of the key objectives, instrumentation and logistic details, goals, and initial scientific findings of the European Marie Curie Action SAPUSS project carried out in the western Mediterranean Basin (WMB) during September-October 2010. The key SAPUSS objective is to deduce aerosol source characteristics and to understand the atmospheric processes responsible for their generation and transformation, both horizontally and vertically, in the Mediterranean urban environment. To achieve this, the unique approach of SAPUSS is the concurrent measurement of aerosols with multiple techniques at six monitoring sites around the city of Barcelona (NE Spain): a main road traffic site, two urban background sites, a regional background site and two urban tower sites (150 m and 545 m above sea level, 150 m and 80 m above ground, respectively). SAPUSS allows us to considerably advance our knowledge of the atmospheric chemistry and physics of the urban Mediterranean environment, which is achieved only because of both the three-dimensional spatial scale and the high sampling time resolution used. During SAPUSS different meteorological regimes were encountered, including warm Saharan, cold Atlantic, wet European and stagnant regional ones. The different meteorology of these regimes is described herein. Additionally, we report the trends of the parameters regulated for air quality purposes (both gaseous and aerosol mass concentrations), and we also compare the six monitoring sites. High levels of traffic-related gaseous pollutants were measured at the urban ground-level monitoring sites, whereas layers of tropospheric ozone were recorded at tower levels. In particular, tower-level night-time average ozone concentrations (80 ± 25 µg m⁻³) were up to double those at ground level.
The examination of the vertical profiles clearly shows the predominant influence of NOx on ozone concentrations, and a source of ozone aloft. Analysis of the particulate matter (PM) mass concentrations shows an enhancement of coarse particles (PM2.5-10) at the urban ground level (+64 %, average 11.7 µg m⁻³) but of fine ones (PM1) at the urban tower level (+28 %, average 14.4 µg m⁻³). These results show complex dynamics of the size-resolved PM mass at both horizontal and vertical levels of the study area. Preliminary modelling findings reveal an underestimation of the fine accumulation aerosols. In summary, this paper lays the foundation of SAPUSS, an integrated study of relevance to many other similar urban Mediterranean coastal environments.

Relevance:

30.00%

Publisher:

Abstract:

Testing for two-sample differences is challenging when the differences are local and involve only a small portion of the data. To solve this problem, we apply a multi-resolution scanning framework that performs dependent local tests on subsets of the sample space. We use a nested dyadic partition of the sample space to obtain a collection of windows and test for sample differences within each window. We place a joint prior on the states of the local hypotheses that allows both vertical and horizontal message passing along the partition tree, reflecting the spatial dependency among windows. This information-passing framework is critical to detecting local sample differences. We use both the loopy belief propagation algorithm and MCMC to obtain the posterior null probability for each window. These probabilities are then used to report sample differences based on decision procedures. Simulation studies are conducted to illustrate the performance. Multiple testing adjustment and convergence of the algorithms are also discussed.
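A simplified, frequentist stand-in for the scanning step can be sketched as follows: enumerate the dyadic windows and run an independent two-proportion z-test in each. This omits the joint prior and the message passing that make the paper's local tests dependent, so it is only an illustration of the windowing idea:

```python
import numpy as np

def dyadic_windows(depth):
    """All windows of a nested dyadic partition of [0, 1) up to a given depth."""
    return [(k / 2**d, (k + 1) / 2**d) for d in range(depth + 1) for k in range(2**d)]

def scan(x, y, depth=3):
    """Per-window z-statistic comparing the fractions of the two samples that
    fall inside each window (independent tests; no dependency structure)."""
    stats = {}
    for lo, hi in dyadic_windows(depth):
        px = np.mean((x >= lo) & (x < hi))
        py = np.mean((y >= lo) & (y < hi))
        p = (px * len(x) + py * len(y)) / (len(x) + len(y))      # pooled proportion
        se = np.sqrt(p * (1 - p) * (1 / len(x) + 1 / len(y)))
        stats[(lo, hi)] = 0.0 if se == 0 else (px - py) / se
    return stats

# Hypothetical local difference: y has extra mass only in [0.25, 0.375).
rng = np.random.default_rng(0)
x = rng.uniform(size=2000)
y = np.concatenate([rng.uniform(size=1600), rng.uniform(0.25, 0.375, size=400)])
z = scan(x, y, depth=3)
# The window containing the local bump should carry a large |z|;
# the root window (0, 1), where the samples agree globally, should not.
```

Running many such window tests is what creates the multiple-testing burden, and sharing information between overlapping windows across scales is what the paper's joint prior and message passing are for.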

Relevance:

30.00%

Publisher:

Abstract:

This thesis details the top-down fabrication of nanostructures on Si and Ge substrates by electron beam lithography (EBL). Various polymeric resist materials were used to create nanopatterns by EBL, and Chapter 1 discusses the development characteristics of these resists. Chapter 3 describes the processing parameters, resolution, and topographical and structural changes of a new EBL resist known as 'SML'. A comparison between SML and the standard resists PMMA and ZEP520A was undertaken to determine the suitability of SML as an EBL resist. It was established that SML is capable of high-resolution patterning and showed good pattern transfer capabilities. Germanium is a desirable material for use in microelectronic applications due to a number of qualities superior to those of silicon. EBL patterning of Ge with high-resolution hydrogen silsesquioxane (HSQ) resist is, however, difficult due to the presence of native surface oxides. Thus, to combat this problem, a new technique for passivating Ge surfaces prior to EBL processes is detailed in Chapter 4. The surface passivation was carried out using simple acids such as citric acid and acetic acid. The acids were gentle on the surface and enabled the formation of high-resolution arrays of Ge nanowires using HSQ resist. Chapter 5 details the directed self-assembly (DSA) of block copolymers (BCPs) on EBL-patterned Si and, for the very first time, Ge surfaces. DSA of BCPs on templated substrates is a promising technology for high-volume and cost-effective nanofabrication. The BCP employed for this study was poly(styrene-b-ethylene oxide) and the substrates were pre-defined by HSQ templates produced by EBL. The DSA technique resulted in pattern rectification (ordering in the BCP) and in pattern multiplication within smaller areas.

Relevance:

30.00%

Publisher:

Abstract:

Background - Image blurring in Full Field Digital Mammography (FFDM) is reported to be a problem within many UK breast screening units, resulting in a significant proportion of technical repeats/recalls. Our study investigates monitors of differing pixel resolution, and whether there is a difference in blurring detection between a 2.3 MP technical review monitor and a 5 MP standard reporting monitor. Methods - Simulation software was created to induce different magnitudes of blur on 20 artifact-free FFDM screening images. 120 blurred and non-blurred images were randomized, displayed on the 2.3 and 5 MP monitors, and reviewed by 28 trained observers. Monitors were calibrated to the DICOM Grayscale Standard Display Function. A t-test was used to determine whether significant differences exist in blurring detection between the monitors. Results - The blurring detection rates on the 2.3 MP monitor for 0.2, 0.4, 0.6, 0.8 and 1 mm blur were 46, 59, 66, 77 and 78% respectively, and on the 5 MP monitor 44, 70, 83, 96 and 98%. All the non-motion images were identified correctly. A statistically significant difference (p < 0.01) in the blurring detection rate between the two monitors was demonstrated. Conclusions - Given the results of this study, and knowing that monitors as low as 1 MP are used in clinical practice, we speculate that technical recall/repeat rates due to blurring could be reduced if higher-resolution monitors were used for technical review at the time of imaging. Further work is needed to determine the minimum monitor specification for visual blurring detection.
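The principle behind the blur simulation can be illustrated numerically: blurring suppresses high-frequency detail, which any sharpness measure should register as a falling score. The sketch below uses a separable box blur and a Laplacian-variance sharpness score on a synthetic image; both are illustrative assumptions, not the study's simulation software:

```python
import numpy as np

def box_blur(img, radius):
    """Separable box blur of width 2*radius + 1 (radius 0 = no blur)."""
    if radius == 0:
        return img.astype(float)
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img.astype(float))
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

def sharpness(img):
    """Variance of the discrete Laplacian: a simple focus/blur score."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
    return float(lap.var())

# Synthetic "image": sharpness should fall monotonically as blur grows.
img = np.random.default_rng(0).uniform(size=(128, 128))
scores = [sharpness(box_blur(img, r)) for r in (0, 1, 2, 4)]
```

In the study the detectors are human observers rather than a numeric score, but the same monotone relationship between blur magnitude and detectability is what the rising detection rates (46-78% and 44-98%) reflect.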