986 results for Perfect


Relevance: 10.00%

Abstract:

This paper discusses ECG signal classification after parametrizing the ECG waveforms in the wavelet domain. Signal decomposition using perfect reconstruction quadrature mirror filter banks can provide a very parsimonious representation of ECG signals. In the current work, the filter parameters are adjusted by a numerical optimization algorithm in order to minimize a cost function associated with the filter cut-off sharpness. The goal is to achieve a better compromise between frequency selectivity and time resolution at each decomposition level than standard orthogonal filter banks such as those of the Daubechies and Coiflet families. Our aim is to decompose the signals optimally in the wavelet domain so that they can subsequently be used as inputs for training a neural network classifier.
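
A minimal sketch of the idea, assuming a two-channel paraunitary lattice parametrization (which guarantees perfect reconstruction for any choice of angles) and a stopband-energy cost as a stand-in for the paper's cut-off sharpness criterion; the number of angles, cut-off edge and optimizer are illustrative assumptions, not the authors' exact formulation, and the regularity constraints used for proper wavelets are omitted.

import numpy as np
from scipy.optimize import minimize
from scipy.signal import freqz

def paraunitary_filters(thetas):
    """Two-channel perfect-reconstruction filter pair from lattice angles."""
    c, s = np.cos(thetas[0]), np.sin(thetas[0])
    E = [[np.array([c]), np.array([s])],      # 2x2 polyphase matrix,
         [np.array([-s]), np.array([c])]]     # entries are polynomials in z^-1
    for th in thetas[1:]:
        c, s = np.cos(th), np.sin(th)
        r0 = [np.append(p, 0.0) for p in E[0]]   # pad first row to new length
        r1 = [np.append(0.0, p) for p in E[1]]   # z^-1 delay on second row
        E = [[c * r0[j] + s * r1[j] for j in range(2)],
             [-s * r0[j] + c * r1[j] for j in range(2)]]
    h0 = np.column_stack([E[0][0], E[0][1]]).ravel()  # interleave polyphase parts
    h1 = np.column_stack([E[1][0], E[1][1]]).ravel()
    return h0, h1

def cutoff_cost(thetas, edge=0.55):
    """Stopband energy of the lowpass filter: smaller means a sharper cut-off."""
    h0, _ = paraunitary_filters(thetas)
    w, H = freqz(h0, worN=512)
    return float(np.sum(np.abs(H[w > edge * np.pi]) ** 2))

# Perfect reconstruction holds by construction, so the search is unconstrained
# and only trades frequency selectivity against filter length.
res = minimize(cutoff_cost, x0=np.full(6, 0.3), method="Nelder-Mead")
h0, h1 = paraunitary_filters(res.x)   # length-12 filters, usable in a DWT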

Relevance: 10.00%

Abstract:

In e-health intervention studies, there are concerns about the reliability of internet-based, self-reported (SR) data and about the potential for identity fraud. This study introduced and tested a novel procedure for assessing the validity of internet-based, SR identity, and validated anthropometric and demographic data against measurements performed face-to-face in a validation study (VS). Participants (n = 140) from seven European countries, participating in the Food4Me intervention study, which aimed to test the efficacy of personalised nutrition approaches delivered via the internet, were invited to take part in the VS. Participants visited a research centre in each country within 2 weeks of providing SR data via the internet and received detailed instructions on how to perform each measurement. Each participant's identity was checked visually and by repeated collection and analysis of buccal-cell DNA for 33 genetic variants. Validation of identity using genomic information showed perfect concordance between SR and VS. Similar results were found for demographic data (age and sex verification). We observed strong intra-class correlation coefficients between SR and VS anthropometric data (height 0.990, weight 0.994 and BMI 0.983). However, internet-based SR weight was under-reported (Δ −0.70 kg [−3.6 to 2.1], p < 0.0001) and, therefore, BMI was lower for SR data (Δ −0.29 kg m^−2 [−1.5 to 1.0], p < 0.0001). BMI classification was correct in 93 % of cases. We demonstrate the utility of genotype information for detecting possible identity fraud in e-health studies and confirm the reliability of the internet-based, SR anthropometric and demographic data collected in the Food4Me study.
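
For the anthropometric agreement, the intra-class correlation between paired self-reported and measured values can be computed as in this sketch; the file and column names are hypothetical, not the Food4Me variable names.

import pandas as pd
import pingouin as pg

# Hypothetical extract with one row per participant.
df = pd.read_csv("food4me_validation.csv")
long = df.melt(id_vars="participant",
               value_vars=["weight_self_report", "weight_measured"],
               var_name="source", value_name="weight")
# ICC across the two "raters" (self-report vs face-to-face measurement).
icc = pg.intraclass_corr(data=long, targets="participant",
                         raters="source", ratings="weight")
print(icc[["Type", "ICC"]])   # compare with the reported weight ICC of 0.994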

Relevance: 10.00%

Abstract:

We propose a bargaining process supergame over the strategies to play in a non-cooperative game. The agreement reached by players at the end of the bargaining process is the strategy profile that they will play in the original non-cooperative game. We analyze the subgame perfect equilibria of this supergame and their implications for the original game. We discuss the existence, uniqueness and efficiency of the agreement reachable through this bargaining process. We illustrate the consequences of applying such a process to several common two-player non-cooperative games: the Prisoner's Dilemma, the Hawk-Dove Game, the Trust Game and the Ultimatum Game. In each of them, the proposed bargaining process gives rise to Pareto-efficient agreements that are typically different from the Nash equilibria of the original games.
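
The gap between the Nash equilibrium and Pareto-efficient agreements that such a bargaining process is designed to close can be checked in a few lines; the payoffs below are a standard Prisoner's Dilemma, and the sketch illustrates only that gap, not the supergame itself.

from itertools import product

# Standard Prisoner's Dilemma payoffs (row player, column player).
payoff = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}
acts = ['C', 'D']

def is_nash(p):
    """Pure Nash: each action is a best response to the other's action."""
    r, c = p
    return (payoff[p][0] >= max(payoff[(a, c)][0] for a in acts) and
            payoff[p][1] >= max(payoff[(r, a)][1] for a in acts))

def is_pareto(p):
    """Pareto-efficient: no profile weakly improves both payoffs, one strictly."""
    return not any(all(payoff[q][i] >= payoff[p][i] for i in (0, 1)) and
                   any(payoff[q][i] > payoff[p][i] for i in (0, 1))
                   for q in product(acts, repeat=2) if q != p)

print([p for p in product(acts, repeat=2) if is_nash(p)])    # [('D', 'D')]
print([p for p in product(acts, repeat=2) if is_pareto(p)])  # ('C','C') among
# them; the unique Nash profile ('D','D') is not Pareto-efficient.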

Relevance: 10.00%

Abstract:

Soils play a pivotal role in major global biogeochemical cycles (carbon, nutrient and water), while hosting the largest diversity of organisms on land. Because of this, soils deliver fundamental ecosystem services, and management to change a soil process in support of one ecosystem service can either provide co-benefits to other services or result in trade-offs. In this critical review, we report the state-of-the-art understanding concerning the biogeochemical cycles and biodiversity in soil, and relate these to the provisioning, regulating, supporting and cultural ecosystem services which they underpin. We then outline key knowledge gaps and research challenges, before providing recommendations for management activities to support the continued delivery of ecosystem services from soils. We conclude that although there are knowledge gaps that require further research, enough is known to start improving soils globally. The main challenge is in finding ways to share knowledge with soil managers and policy-makers, so that best-practice management can be implemented. A key element of this knowledge sharing must be in raising awareness of the multiple ecosystem services underpinned by soils, and the natural capital they provide. The International Year of Soils in 2015 presents the perfect opportunity to begin a step-change in how we harness scientific knowledge to bring about more sustainable use of soils for a secure global society.

Relevance: 10.00%

Abstract:

This paper investigates the value of a generic storage system within two GB market mechanisms and one ancillary service: the wholesale power market, the Balancing Mechanism and Firm Frequency Response (FFR). Three models are evaluated under perfect foresight and a fixed horizon, which is subsequently extended to explore the impact of longer foresight on market revenues. The results show that, comparatively, the Balancing Mechanism represents the highest source of potential revenues, followed by the wholesale power market and Firm Frequency Response respectively. Longer horizons show diminishing returns, with the 1-day horizon providing the vast majority of total revenues; however, storage power-capacity utilization benefits from such long horizons. These results could imply that short horizons are very effective in capturing revenues in both the wholesale market and the Balancing Mechanism, whereas the sizing of a storage system should take horizon length and forecast accuracy into consideration for greater benefit.
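
Under perfect foresight, scheduling a generic storage unit against a single price series reduces to a linear program; this sketch uses synthetic prices and an illustrative 1 MW / 4 MWh device with charge-side losses only, so it shows the structure of such a model rather than the paper's three-revenue-stream formulation.

import numpy as np
from scipy.optimize import linprog

prices = 40 + 25 * np.sin(np.arange(24) * 2 * np.pi / 24)  # synthetic GBP/MWh
T, P, E, eta = len(prices), 1.0, 4.0, 0.9   # hours, MW, MWh, charge efficiency

# Decision vector x = [charge_0..charge_T-1, discharge_0..discharge_T-1].
obj = np.concatenate([prices, -prices])     # minimise cost = -(arbitrage revenue)
L = np.tril(np.ones((T, T)))                # cumulative sums give the state of charge
A_ub = np.block([[ eta * L, -L],            #  SoC_t <= E  (energy capacity)
                 [-eta * L,  L]])           # -SoC_t <= 0  (cannot go negative)
b_ub = np.concatenate([np.full(T, E), np.zeros(T)])
res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, P)] * (2 * T))
print(f"perfect-foresight wholesale revenue: {-res.fun:.1f} GBP")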

Relevance: 10.00%

Abstract:

We describe infinitely scalable pipeline machines with perfect parallelism, in the sense that every instruction of an inline program is executed, on successive data, on every clock tick. Programs with shared data effectively execute in less than a clock tick. We show that pipeline machines are faster than single- or multi-core von Neumann machines for sufficiently many runs of a sufficiently time-consuming program. Our pipeline machines exploit the totality of transreal arithmetic and the known waiting time of statically compiled programs to deliver the interesting property that they need no hardware or software exception handling.
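
The exception-freedom claim rests on the totality of transreal arithmetic, in which every operation, including division by zero, is defined. A minimal sketch, using IEEE NaN as a stand-in for the transreal nullity Φ:

import math

NULLITY = float("nan")          # stand-in for the transreal nullity Φ

def transreal_div(a, b):
    """Total division: defined for all inputs, so it never raises, and a
    pipeline built on it needs no hardware or software exception handling."""
    if b != 0:
        return a / b
    if a > 0:
        return math.inf         # a / 0 = +infinity for a > 0
    if a < 0:
        return -math.inf        # a / 0 = -infinity for a < 0
    return NULLITY              # 0 / 0 = Φ (nullity)

print(transreal_div(1, 0), transreal_div(-1, 0), transreal_div(0, 0))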

Relevance: 10.00%

Abstract:

Determining the internal layout of archaeological structures and their uses has always been challenging, particularly in timber-framed or earthen-walled buildings where doorways and divisions are difficult to trace. In temperate conditions, however, soil formation processes may hold the key to understanding how buildings were used. The abandoned Roman town of Silchester, UK, provides a perfect case study for testing a new approach combining experimental archaeology and micromorphology. The results show that this technique can resolve previously uncertain features of urban architecture, such as the presence of a roof and changes in internal organisation and use over time.

Relevance: 10.00%

Abstract:

4-Dimensional Variational Data Assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function, which is constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis: the model-error formulation and the state-estimation formulation. The 4DVAR objective function is traditionally solved using gradient-based iterative methods. The principal method used in Numerical Weather Prediction today is the Gauss-Newton approach. This method introduces a linearised 'inner-loop' objective function which, upon convergence, updates the solution of the non-linear 'outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function, and indicate the difficulty one may encounter when iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure of the sensitivity of the problem to input data, and can also indicate the rate of convergence and solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process for both wc4DVAR objective functions to the internal assimilation parameters that compose the problem. We gain insight into these sensitivities by bounding the condition number of the Hessians of both objective functions. We also precondition the model-error objective function and show improved convergence. Using the bounds, we show that the sensitivities of both formulations are related to the error-variance balance, the assimilation window length and the correlation length-scales. We further demonstrate this through numerical experiments on the condition number and data assimilation experiments using linear and non-linear chaotic toy models.
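
In standard notation (background state x_b, covariances B, R_i and Q_i, observation operators H_i, model operators M_i), the two objective functions take the textbook forms sketched below; the weak-constraint version shown is the model-error formulation, and the thesis's exact notation may differ.

% Strong-constraint 4DVAR: the model is assumed perfect.
J_{sc}(x_0) = \tfrac{1}{2}(x_0 - x_b)^{\mathrm{T}} B^{-1}(x_0 - x_b)
  + \tfrac{1}{2}\sum_{i=0}^{N} \bigl(y_i - \mathcal{H}_i(x_i)\bigr)^{\mathrm{T}}
    R_i^{-1} \bigl(y_i - \mathcal{H}_i(x_i)\bigr),
  \quad x_{i+1} = \mathcal{M}_i(x_i).

% Weak-constraint (model-error) formulation: the model errors \eta_i are
% extra control variables, giving the additional degrees of freedom.
J_{wc}(x_0, \eta_1, \dots, \eta_N) = J_{sc}(x_0)
  + \tfrac{1}{2}\sum_{i=1}^{N} \eta_i^{\mathrm{T}} Q_i^{-1} \eta_i,
  \quad x_{i+1} = \mathcal{M}_i(x_i) + \eta_{i+1}.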

Relevance: 10.00%

Abstract:

Current state-of-the-art global climate models produce different values for Earth's mean temperature. When comparing simulations with each other and with observations, it is standard practice to compare temperature anomalies with respect to a reference period. It is not always appreciated that the choice of reference period can affect conclusions both about the skill of simulations of past climate and about the magnitude of expected future changes in climate. For example, observed global temperatures over the past decade are towards the lower end of the range of CMIP5 simulations irrespective of what reference period is used, but exactly where they lie in the model distribution varies with the choice of reference period. Additionally, we demonstrate that projections of when particular temperature levels are reached, for example 2 K above 'pre-industrial', change by up to a decade depending on the choice of reference period. In this article we discuss some of the key issues that arise when using anomalies relative to a reference period to generate climate projections. We highlight that there is no perfect choice of reference period. When evaluating models against observations, a long reference period should generally be used, but how long depends on the quality of the observations available. The IPCC AR5 choice of a 1986-2005 reference period for future global temperature projections was reasonable, but a case-by-case approach is needed for different purposes and when assessing projections of different climate variables. Finally, we recommend that any study that involves the use of a reference period should explicitly examine the robustness of its conclusions to alternative choices.
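
The mechanics of the reference-period dependence are simple: changing the window shifts an anomaly series by a constant, which is enough to move observations around within a model ensemble's spread. A sketch with synthetic data and two commonly used windows:

import numpy as np

def anomalies(temps, years, ref_start, ref_end):
    """Temperature anomalies relative to the mean over a reference period."""
    ref = (years >= ref_start) & (years <= ref_end)
    return temps - temps[ref].mean()

years = np.arange(1900, 2016)
rng = np.random.default_rng(1)
temps = 0.008 * (years - 1900) + rng.normal(0.0, 0.1, years.size)  # toy record
a_6190 = anomalies(temps, years, 1961, 1990)
a_8605 = anomalies(temps, years, 1986, 2005)   # the IPCC AR5 choice
offset = a_6190 - a_8605
print(offset.std(), offset.mean())  # std ~ 0: the series differ by a constant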

Relevance: 10.00%

Abstract:

Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the surviving processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault-tolerance techniques. This MPI_Comm_shrink operation requires a fault-tolerant failure detection and consensus algorithm. This paper presents and compares two novel failure detection and consensus algorithms. The proposed algorithms are based on Gossip protocols and are inherently fault-tolerant and scalable. They were implemented and tested using the Extreme-scale Simulator. The results show that in both algorithms the number of Gossip cycles needed to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage, and perfect synchronization in achieving global consensus.
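
A toy simulation of the gossip mechanism, not the paper's exact protocols, illustrates the logarithmic scaling of the cycle count; the initial-detection rule and uniform random peer selection are illustrative assumptions.

import random

def gossip_consensus(n, failed, seed=0):
    """Count gossip cycles until every surviving rank knows all failures."""
    rng = random.Random(seed)
    alive = [r for r in range(n) if r not in failed]
    view = {r: set() for r in alive}
    for f in failed:                        # each failure is first detected
        view[rng.choice(alive)].add(f)      # by a single surviving rank
    cycles = 0
    while any(view[r] != failed for r in alive):
        cycles += 1
        for r in alive:                     # one exchange per rank per cycle
            peer = rng.choice(alive)
            merged = view[r] | view[peer]   # gossip merges the two views
            view[r] = view[peer] = merged
    return cycles

# Cycle counts grow roughly with log2(n), echoing the paper's scaling result.
for n in (64, 256, 1024, 4096):
    print(n, gossip_consensus(n, failed={1, 2, 3}))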

Relevance: 10.00%

Abstract:

Initializing the ocean for decadal predictability studies is a challenge, as it requires reconstructing the sparsely observed subsurface trajectory of ocean variability. In this study we explore to what extent surface nudging using well-observed sea surface temperature (SST) can reconstruct the deeper ocean variations for the 1949-2005 period. An ensemble is produced with a nudged version of the IPSLCM5A model and compared to ocean reanalyses and reconstructed datasets. The SST is restored to observations using a physically based relaxation coefficient, in contrast to earlier studies, which use a much larger value. The assessment is restricted to the regions where the ocean reanalyses agree, i.e. the upper 500 m of the ocean, although this can be latitude- and basin-dependent. Significant reconstruction of the subsurface is achieved in specific regions, namely the subduction regions of the subtropical Atlantic, below the thermocline in the equatorial Pacific and, in some cases, the North Atlantic deep convection regions. Beyond the mean correlations, ocean integrals are used to explore the time evolution of the correlation over 20-year windows. Classical fixed-depth heat-content diagnostics do not show significant agreement between the different existing observation-based references and can therefore not be used to assess globally averaged time-varying correlations in the nudged simulations. Using the physically based average temperature above an isotherm (14 °C) alleviates this issue in the tropics and subtropics and shows significant reconstruction of these quantities in the nudged simulations for several decades. This skill is attributed to the wind stress reconstruction in the tropics, as already demonstrated in a perfect-model study using the same model. Thus, we also show here the robustness of this result in a historical and observational context.
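
Schematically, the nudging enters the model's surface temperature equation as a restoring heat flux with a physically based coefficient \lambda (in W m^-2 K^-1), far smaller in its restoring strength than the coefficients of earlier studies; the model's exact implementation may differ.

% Restoring heat flux added to the surface temperature tendency.
Q_{nudge} = -\lambda \,\bigl(\mathrm{SST}_{model} - \mathrm{SST}_{obs}\bigr),
\qquad
\frac{\partial T_{surf}}{\partial t} = \cdots + \frac{Q_{nudge}}{\rho_0\, c_p\, h},

% where h is the thickness of the uppermost model layer and \rho_0 c_p
% the volumetric heat capacity of seawater.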

Relevance: 10.00%

Abstract:

In the eighteenth century the printing of Greek texts continued to be central to scholarship and discourse. The typography of Greek texts can be characterised as a continuation of French models from the sixteenth century, with a gradual dilution of the complexity of ligatures and abbreviations, mostly through printers in the Low Countries. In Britain, Greek printing was dominated by the university presses, which conservatively reproduced the continental models, exemplified by Oxford's Fell types, which were Dutch adaptations of earlier French models. Hindsight allows us to identify a meaningful development in the Greek types cut by Alexander Wilson for the Foulis Press in Glasgow, but we can argue that by the middle of the eighteenth century, when Baskerville was considering Greek printing, the typographic environment was ripe for a new style of Greek types. The opportunity to cut the types for a New Testament (in a twin edition that included a generous octavo and a large quarto version) would seem perfect for showcasing Baskerville's capacity for innovation. His Greek type maintained the cursive ductus of earlier models, but abandoned complex ligatures and any hint of scribal flourish. He homogenised the modulation of the letter strokes and the treatment of terminals, and normalised the horizontal alignments of all letters. Although the strokes of some letters are too delicate, the narrow set of the style composes a consistent, uniform texture that is a clean break from contemporaneous models. The argument is made that this is the first Greek typeface that can be described as fully typographic in the context of the technology of the time. It set a pattern that was to be followed, without acknowledgement, by Richard Porson nearly half a century later. The typeface received little praise from typographic historians, and was condemned by Victor Scholderer in his retrospective of Greek typography. A survey of typeface reviews in the surrounding decades establishes that the commentators were mostly reproducing the views of an arbitrary typographic orthodoxy, for which only types with direct references to Renaissance models were acceptable. In these comments we detect a bias against someone considered an arriviste in the scholarly printing establishment, as well as a conservative attitude to typographic innovation.

Relevance: 10.00%

Abstract:

In the present report, and for the first time in the international literature, the impact of the addition of NaCl upon growth and lipid production in the oleaginous yeast Rhodosporidium toruloides was studied. Moreover, equally for the first time, lipid production by R. toruloides was performed under non-aseptic conditions. The potential of R. toruloides DSM 4444 to produce lipid in media containing several initial concentrations of NaCl, with glucose employed as carbon source, was therefore studied. Preliminary batch-flask trials with increasing amounts of NaCl revealed the strain's tolerance of NaCl contents up to 6.0% (w/v). However, 4.0% (w/v) NaCl stimulated lipid accumulation in this strain, enhancing lipid production up to 71.3% (w/w) of dry cell weight. The same amount of NaCl was employed in pasteurized batch-flask cultures in order to investigate the role of the salt as a bacterial inhibiting agent. The combination of NaCl and high glucose concentrations was found to satisfactorily suppress bacterial contamination of R. toruloides cultures under these conditions. Batch-bioreactor trials of the yeast in the same media with high glucose content (up to 150 g/L) resulted in satisfactory substrate assimilation, with an almost linear kinetic profile for lipid production regardless of the initial glucose concentration imposed. Finally, fed-batch bioreactor cultures led to the production of 37.2 g/L of biomass with a lipid content of 64.5% (w/w). The lipid yield per unit of glucose consumed reached the very satisfactory value of 0.21 g/g, amongst the highest values in the literature. The yeast lipid produced contained mainly oleic acid and, to a lesser extent, palmitic and stearic acids, thus constituting a perfect starting material for "second-generation" biodiesel.

Relevance: 10.00%

Abstract:

An essay on love and liberty in the writings of Gillian Rose, the Marquis de Sade, Max Horkheimer and Theodor W. Adorno, written in response to the following provocation: "Encore un effort. A banderole publicitaire carries the breathless descriptions of the new fashions for 1968, when anything goes and details place the accent on this or that part of the body and its adornment: a pair of shoes that come off in a struggle, for example, the heel of one snapped off; a striking checked shirt, with two buttons undone; a light-coloured trench coat (perfect for a May day); a blouson-style jacket that allows easy freedom of movement; casual slacks worn with an ankle boot. Beauty is in the streets as fashion becomes democratic (or so say the houses of haute couture), while the philosopher of the boudoir exhorts us once again to make an effort if we wish to be republicans. Here, to an assembled crowd of sensitive men and women, which petit-maître or dangerous man of principles would suggest that the only moral system to reinforce political revolution is that of libertinage, the revenge of nature's course against the aberrations of society?"

Relevance: 10.00%

Abstract:

This paper builds on existing theoretical work on sex markets (Della Giusta, Di Tommaso and Strøm, 2009a). Using data from the British Sexual Attitudes Survey, we aim to replicate the analysis of the demand for paid sex previously conducted for the US (Della Giusta, Di Tommaso, Shima and Strøm, 2009b). We test formally the effects of attitudes, risky behaviours and personal characteristics on the demand for paid sex. Findings from empirical studies of clients suggest that personal characteristics (personal and family background, self-perception, perceptions of women, sexual preferences, etc.), economic factors (education, income, work), attitudes towards risk (both the health hazard and the risk of being caught where sex work is illegal) and attitudes towards relationships and sex are all likely to affect demand. Previous theoretical work has argued that stigma plays a fundamental role in determining both demand and risk and that, in particular, owing to the presence of stigma, the demand for sex and the demand for paid sex are not, as has been argued elsewhere, perfect substitutes. We use data from the British Sexual Attitudes Survey of 2001 to test these hypotheses. We find a positive effect of education (a proxy for income), negative effects of professional status (a proxy for the stigma associated with buying sex), positive and significant effects of all risky-behaviour variables, and no significant effects of the variables measuring the relative degree of moral conservatism. We conclude with some policy implications.
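
The kind of specification implied here is a binary-choice model of ever having paid for sex; a sketch with statsmodels follows, in which the file and variable names are hypothetical rather than the actual survey codings.

import pandas as pd
import statsmodels.api as sm

# Hypothetical extract; the British Sexual Attitudes Survey codings differ.
df = pd.read_csv("bsas_2001_extract.csv")
X = sm.add_constant(df[["education_years",          # proxy for income
                        "professional_status",      # proxy for stigma exposure
                        "risky_behaviour_index",
                        "moral_conservatism_score"]])
probit = sm.Probit(df["paid_for_sex"], X).fit()     # binary outcome 0/1
print(probit.summary())   # signs and significance of the effects discussed above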