877 results for "new method"


Relevance:

60.00%

Publisher:

Abstract:

The performance of flood inundation models is often assessed using satellite-observed data; however, these data have inherent uncertainty. In this study we assess the impact of this uncertainty when calibrating a flood inundation model (LISFLOOD-FP) for a flood event in December 2006 on the River Dee, North Wales, UK. The flood extent is delineated from an ERS-2 SAR image of the event using an active contour model (snake), and water levels at the flood margin are calculated by intersecting the shoreline vector with LiDAR topographic data. Gauged water levels are used to create a reference water surface slope for comparison with the satellite-derived water levels. Residuals between the satellite-observed data points and those from the reference line are spatially clustered into groups of similar values. We show that model calibration achieved using pattern matching of observed and predicted flood extent is negatively influenced by this spatial dependency in the data. By contrast, model calibration using water elevations produces realistic calibrated optimum friction parameters even when spatial dependency is present. To test the impact of removing spatial dependency, a new method of evaluating flood inundation model performance is developed using multiple random subsamples of the water surface elevation data points. By testing for spatial dependency using Moran’s I, multiple subsamples of water elevations that have no significant spatial dependency are selected. The model is then calibrated against these data and the results are averaged. This gives a near-identical result to calibration using spatially dependent data, but has the advantage of being a statistically robust assessment of model performance in which we can have more confidence. Moreover, by using the variations found in the subsamples of the observed data it is possible to assess the effects of observational uncertainty on the assessment of flood risk.
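The subsampling-and-testing procedure described above can be sketched as follows. This is a minimal illustration, not the study's code: the point coordinates, residuals, inverse-distance weighting scheme and the |I| acceptance threshold are all invented for the example.

```python
import numpy as np

def morans_i(values, weights):
    """Moran's I spatial autocorrelation statistic for a set of values
    given a spatial weights matrix (zero diagonal)."""
    z = values - values.mean()
    n = len(values)
    num = n * (z[:, None] * z[None, :] * weights).sum()
    den = weights.sum() * (z ** 2).sum()
    return num / den

rng = np.random.default_rng(0)

# Synthetic stand-ins for the satellite-derived shoreline points:
# coordinates along the flood margin and water-level residuals.
coords = rng.uniform(0, 1000, size=(200, 2))   # metres
residuals = rng.normal(0, 0.15, size=200)      # metres

# Inverse-distance weights between nearby points (an illustrative choice).
d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
W = np.where((d > 0) & (d < 200), 1.0 / d, 0.0)

# Draw random subsamples and keep those with no notable spatial
# dependency (a simple |I| threshold stands in for a formal test here).
accepted = []
for _ in range(500):
    idx = rng.choice(len(residuals), size=30, replace=False)
    if abs(morans_i(residuals[idx], W[np.ix_(idx, idx)])) < 0.05:
        accepted.append(idx)

# Each accepted subsample would be used to calibrate the model; the
# resulting optimum friction parameters are then averaged.
print(len(accepted), "subsamples accepted")
```

Calibration against each accepted subsample, followed by averaging, gives the statistically robust assessment described in the abstract.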


Motivation: A new method that uses support vector machines (SVMs) to predict protein secondary structure is described and evaluated. The study is designed to develop a reliable prediction method using an alternative technique and to investigate the applicability of SVMs to this type of bioinformatics problem. Methods: Binary SVMs are trained to discriminate between two structural classes. The binary classifiers are combined in several ways to predict multi-class secondary structure. Results: The average three-state prediction accuracy per protein (Q3) is estimated by cross-validation to be 77.07 ± 0.26% with a segment overlap (Sov) score of 73.32 ± 0.39%. The SVM performs similarly to the 'state-of-the-art' PSIPRED prediction method on a non-homologous test set of 121 proteins despite being trained on substantially fewer examples. A simple consensus of the SVM, PSIPRED and PROFsec achieves significantly higher prediction accuracy than the individual methods. Availability: The SVM classifier is available from the authors. Work is in progress to make the method available on-line and to integrate the SVM predictions into the PSIPRED server.
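The combination of binary classifiers into a multi-class predictor can be sketched with a modern library. This is not the authors' implementation: scikit-learn's SVC (which builds pairwise binary SVMs and combines them by one-vs-one voting internally) stands in for the original classifiers, and the features and labels are synthetic rather than real sequence profiles.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n, n_features = 600, 15 * 20        # e.g. a 15-residue window of 20-dim profile columns
X = rng.normal(size=(n, n_features))
y = rng.integers(0, 3, size=n)      # 0 = helix, 1 = strand, 2 = coil
X[:, 0] += y                        # weak class signal so the toy problem is learnable

# SVC trains the three pairwise binary classifiers (H/E, H/C, E/C)
# and combines them by one-vs-one voting -- one of the combination
# schemes such a study would compare.
clf = SVC(kernel="rbf", C=1.0)
q3 = cross_val_score(clf, X, y, cv=5).mean()
print(f"toy Q3 ~ {q3:.2f}")
```

On real data the Q3 estimate would come from per-protein cross-validation, as in the abstract, rather than per-residue folds.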


This volume provides a new perspective on the emergence of the modern study of antiquity, Altertumswissenschaft, in eighteenth-century Germany through an exploration of debates that arose over the work of the art historian Johann Joachim Winckelmann between his death in 1768 and the end of the century. This period has long been recognised as particularly formative for the development of modern classical studies, and over the past few decades has received increased attention from historians of scholarship and of ideas. Winckelmann's eloquent articulation of the cultural and aesthetic value of studying the ancient Greeks, his adumbration of a new method for studying ancient artworks, and his provision of a model of cultural-historical development in terms of a succession of period styles, influenced both the public and intra-disciplinary self-image of classics long into the twentieth century. Yet this area of Winckelmann's Nachleben has received relatively little attention compared with the proliferation of studies concerning his importance for late eighteenth-century German art and literature, for historians of sexuality, and his traditional status as a 'founder figure' within the academic disciplines of classical archaeology and the history of art. Harloe restores the figure of Winckelmann to classicists' understanding of the history of their own discipline and uses debates between important figures, such as Christian Gottlob Heyne, Friedrich August Wolf, and Johann Gottfried Herder, to cast fresh light upon the emergence of the modern paradigm of classics as Altertumswissenschaft: the multi-disciplinary, comprehensive, and historicizing study of the ancient world.


Against the background of serious urban pollution in Hong Kong, the intake fraction (iF) of carbon monoxide due to motor vehicles in the urban areas of Hong Kong is investigated and estimated to be 600 per million, much higher than values for US urban areas, Helsinki and even Beijing, indicating the high exposure level to urban pollutants in Hong Kong. The dependence of iF on meteorological factors is also discussed: easterly and northerly winds contribute most to the total iF value. A new method of predicting the ventilation rate of a city based on the iF concept is proposed, and city ventilation rates for different cities are calculated and compared. Hong Kong is found to have the lowest city ventilation rate and air change rate (ACH) of the cities compared.
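As an illustration of the intake-fraction concept used above, a minimal sketch follows. The populations, concentrations, breathing rate and emission rate are invented placeholders (chosen so the toy result lands near the same order as the reported ~600 per million), not the study's data.

```python
# Intake fraction (iF): the fraction of an emitted pollutant that is
# inhaled by the exposed population, commonly written as
#   iF = sum_i(P_i * BR * C_i) / Q
# for population P_i exposed to source-attributable concentration C_i,
# breathing rate BR and emission rate Q.

BR = 14.5          # m^3/day per person, a typical default breathing rate
Q = 9.0e6          # g/day, hypothetical vehicular CO emission rate

# Hypothetical districts: (population, source-attributable CO in g/m^3)
districts = [
    (1_200_000, 2.0e-4),
    (  800_000, 1.2e-4),
    (  500_000, 0.6e-4),
]

intake = sum(pop * BR * conc for pop, conc in districts)  # g/day inhaled
iF = intake / Q
print(f"iF = {iF * 1e6:.0f} per million")
```

The city-ventilation-rate method in the abstract builds on this quantity, since for a fixed population a lower ventilation rate implies higher source-attributable concentrations and hence a higher iF.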


The orientation of the heliospheric magnetic field (HMF) in near-Earth space is generally a good indicator of the polarity of HMF foot points at the photosphere. There are times, however, when the HMF folds back on itself (is inverted), as indicated by suprathermal electrons locally moving sunward, even though they must ultimately be carrying the heat flux away from the Sun. Analysis of the near-Earth solar wind during the period 1998–2011 reveals that inverted HMF is present approximately 5.5% of the time and is generally associated with slow, dense solar wind and relatively weak HMF intensity. Inverted HMF is mapped to the coronal source surface, where a new method is used to estimate coronal structure from the potential-field source-surface model. We find a strong association with bipolar streamers containing the heliospheric current sheet, as expected, but also with unipolar or pseudostreamers, which contain no current sheet. Because large-scale inverted HMF is a widely accepted signature of interchange reconnection at the Sun, this finding provides strong evidence for models of the slow solar wind which involve coronal loop opening by reconnection within pseudostreamer belts as well as the bipolar streamer belt. Occurrence rates of bipolar streamers and pseudostreamers suggest that they are equally likely to result in inverted HMF and, therefore, presumably undergo interchange reconnection at approximately the same rate. Given the different magnetic topologies involved, this suggests the rate of reconnection is set externally, possibly by the differential rotation rate which governs the circulation of open solar flux.


During long-range transport, many distinct processes – including photochemistry, deposition, emissions and mixing – contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset and quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3chem) over the whole simulation is greater than the net physical processing (ΔO3phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower-tropospheric transport) or production (an upper-tropospheric biomass burning case). However, physical processes influence O3 indirectly through addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3chem than ΔO3phys. Despite its smaller magnitude, the physical processing distinguishes the lower-tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases.
Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-averaged OH following air masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). It is therefore a useful new method for simulating air mass evolution and variability, and their sensitivity to process parameters.
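The deconstruction of ΔO3 into chemical and physical components can be illustrated with a toy Lagrangian budget in which tendencies are accumulated separately along a trajectory and must sum to the net change. All rates below are invented; only the chemical rate is pegged to the −5 ppbv per day quoted above.

```python
import numpy as np

hours = np.arange(0, 120)   # five days of transport, hourly steps

# Invented tendencies (ppbv/hour) along the trajectory:
chem_rate = -5.0 / 24 * np.ones_like(hours, dtype=float)  # net photochemical destruction
mix_rate = 0.02 * np.sin(hours / 24.0)                    # mixing with the background
dep_rate = -0.01 * np.ones_like(hours, dtype=float)       # deposition proxy

d_o3_chem = chem_rate.sum()               # ppbv, chemical component
d_o3_phys = (mix_rate + dep_rate).sum()   # ppbv, physical component
d_o3 = d_o3_chem + d_o3_phys              # net change, the observable quantity

print(f"dO3_chem = {d_o3_chem:.1f} ppbv, dO3_phys = {d_o3_phys:.1f} ppbv")
```

Only ΔO3 is directly constrained by the repeated aircraft intercepts; the split into components is what requires the model.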


In this communication, we describe a new method that has enabled the first patterning of human neurons (hNT neurons, derived from a human teratocarcinoma cell line) on parylene-C/silicon dioxide substrates. We detail the nanofabrication processes, cell differentiation and culturing protocols necessary to successfully pattern hNT neurons, each of which is a key aspect of this new method. The ability to pattern human neurons on silicon chips using an accessible cell line and a robust patterning technology is of widespread value. A combined technology such as this will therefore facilitate the detailed study of the pathological human brain at both the single-cell and network level.


This paper presents a new method to calculate sky view factors (SVFs) from high-resolution urban digital elevation models using a shadow-casting algorithm. By utilizing weighted annuli to derive the SVF from hemispherical images, the light-source positions can be predefined and uniformly spread over the whole hemisphere, whereas an alternative method applies a random set of light-source positions with a cosine-weighted distribution of sun altitude angles. The two methods give similar results over a large number of SVF images. However, when comparing variations at pixel level between an image generated using the new method presented in this paper and an image from the random method, anisotropic patterns occur. The absolute mean difference between the two methods is 0.002, ranging up to 0.040, and the maximum difference can be as much as 0.122. Since the SVF is a geometrically derived parameter, the anisotropic errors created by the random method must be considered significant.
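The weighted-annuli, shadow-casting computation can be sketched as follows. The DEM, the simple ray-marching occlusion test, and the annulus counts are illustrative choices, not the paper's implementation.

```python
import numpy as np

def is_blocked(dem, cellsize, row, col, azimuth, altitude):
    """March a ray from (row, col) towards (azimuth, altitude) and test
    whether the DEM occludes it before the ray leaves the grid."""
    step = np.tan(np.radians(altitude)) * cellsize   # height gained per cell
    dr = -np.cos(np.radians(azimuth))                # north = decreasing row
    dc = np.sin(np.radians(azimuth))
    r, c, z = float(row), float(col), dem[row, col]
    while True:
        r += dr; c += dc; z += step
        ir, ic = int(round(r)), int(round(c))
        if not (0 <= ir < dem.shape[0] and 0 <= ic < dem.shape[1]):
            return False                  # ray left the DEM: sky is visible
        if dem[ir, ic] > z:
            return True                   # terrain blocks this direction

def svf(dem, cellsize, row, col, n_azimuth=36, n_annuli=18):
    """SVF as the solid-angle-weighted fraction of visible sky, sampled
    at predefined directions spread uniformly over the hemisphere."""
    total, visible = 0.0, 0.0
    for k in range(n_annuli):
        alt = (k + 0.5) * 90.0 / n_annuli             # annulus mid-altitude
        w = np.sin(np.radians(alt)) * np.cos(np.radians(alt))  # annulus weight
        for j in range(n_azimuth):
            az = j * 360.0 / n_azimuth
            total += w
            if not is_blocked(dem, cellsize, row, col, az, alt):
                visible += w
    return visible / total

# Sanity check: on a flat DEM the whole sky is visible, so SVF = 1.
flat = np.zeros((21, 21))
print(svf(flat, 1.0, 10, 10))   # -> 1.0
```

Because the directions are predefined rather than drawn at random, repeated evaluations give identical, isotropic sampling, which is the property the comparison in the abstract turns on.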


We measure infrared absorption spectra of 18 hydrochlorofluorocarbons and hydrofluorocarbons, seven of which do not yet appear in the literature. The spectra are used in a narrowband model of the terrestrial infrared radiation to calculate radiative forcing and global warming potentials. We investigate the sensitivity of the radiative forcing to the absorption spectrum temperature dependence, halocarbon vertical profile, stratospheric adjustment, cloudiness, spectral overlap, and latitude, and we make some recommendations for the reporting of radiative forcings that would help to resolve discrepancies between assessments. We investigate simple methods of estimating instantaneous radiative forcing directly from a molecule's absorption spectrum and we present a new method that agrees to within 0.3% with our narrowband model results.
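The "simple methods" of estimating instantaneous forcing directly from an absorption spectrum amount to integrating the cross-section against a precomputed per-band forcing efficiency. The sketch below shows only the shape of that calculation; both spectra are invented placeholders, not measured data, and all constants are absorbed into the efficiency curve.

```python
import numpy as np

wavenumber = np.arange(500, 1500, 10.0)   # cm^-1, 10 cm^-1 bins

# Hypothetical halocarbon absorption cross-section per bin.
sigma = 1e-18 * np.exp(-((wavenumber - 1100) / 80.0) ** 2)

# Hypothetical forcing efficiency per bin (W m^-2 per unit absorption),
# peaking in the mid-infrared atmospheric window.
efficiency = 0.3 * np.exp(-((wavenumber - 1000) / 250.0) ** 2)

# Instantaneous forcing per unit mixing ratio ~ sum over bins of
# cross-section times per-bin efficiency.
forcing = float(np.sum(sigma * efficiency))
print(f"estimated instantaneous forcing (arbitrary units): {forcing:.3e}")
```

The narrowband model in the abstract plays the role of the reference against which such a shortcut is validated.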


Seamless phase II/III clinical trials are conducted in two stages with treatment selection at the first stage. In the first stage, patients are randomized to a control or one of k > 1 experimental treatments. At the end of this stage, interim data are analysed, and a decision is made concerning which experimental treatment should continue to the second stage. If the primary endpoint is observable only after some period of follow-up, at the interim analysis data may be available on some early outcome on a larger number of patients than those for whom the primary endpoint is available. These early endpoint data can thus be used for treatment selection. For two previously proposed approaches, the power has been shown to be greater for one or other method depending on the true treatment effects and correlations. We propose a new approach that builds on the previously proposed approaches and uses data available at the interim analysis to estimate these parameters and then, on the basis of these estimates, chooses the treatment selection method with the highest probability of correctly selecting the most effective treatment. This method is shown to perform well compared with the two previously described methods for a wide range of true parameter values. In most cases, the performance of the new method is either similar to or, in some cases, better than either of the two previously proposed methods.
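The adaptive choice between selection rules can be sketched with a small simulation. The two candidate rules, the decision criterion and all numbers below are simplified stand-ins for the methods compared in the paper, purely to illustrate estimating the early/primary correlation at the interim analysis and branching on it.

```python
import numpy as np

rng = np.random.default_rng(42)

k = 3                      # experimental treatments
n_interim = 60             # patients per arm with early-endpoint data
n_primary = 20             # of whom the primary endpoint is also observed
true_effect = np.array([0.1, 0.3, 0.5])
rho = 0.7                  # true early/primary correlation (unknown in practice)

# Simulate correlated early and primary responses per arm.
early = true_effect + rng.normal(size=(n_interim, k))
primary = (true_effect + rho * (early[:n_primary] - true_effect)
           + np.sqrt(1 - rho**2) * rng.normal(size=(n_primary, k)))

# Estimate the correlation from patients with both endpoints.
rho_hat = np.mean([np.corrcoef(early[:n_primary, j], primary[:, j])[0, 1]
                   for j in range(k)])

# Rule A: select on the early endpoint (all interim patients).
# Rule B: select on the primary endpoint (fewer patients, no surrogacy).
# A high estimated correlation favours rule A (a deliberate simplification).
if rho_hat > 0.5:
    selected = int(np.argmax(early.mean(axis=0)))
else:
    selected = int(np.argmax(primary.mean(axis=0)))
print("selected arm:", selected, "estimated correlation:", round(rho_hat, 2))
```

In the paper the branch is taken by comparing the estimated probability of correct selection under each rule, not a bare correlation threshold; the structure of the decision is the same.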


A new method to detect the vibrational circular dichroism (VCD) of a localized part of a chiral molecular system is reported. A local VCD amplifier was implemented, and the distance dependence of the amplification was investigated in a series of peptides. The results indicate a characteristic distance of 2.0±0.3 bonds, which suggests that the amplification is a localized phenomenon. The amplifier can be covalently coupled to a specific part of a molecule, and can be switched ON and OFF electrochemically. By subtracting the VCD spectra obtained when the amplifier is in the ON and OFF states, the VCD of the local environment of the amplifier can be separated from the total VCD spectrum. Switchable local VCD amplification thus makes it possible to “zoom in” on a specific part of a chiral molecule.
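The ON/OFF subtraction that isolates the local VCD can be written in a few lines; the spectra below are synthetic placeholders, not measured data.

```python
import numpy as np

wavenumber = np.linspace(1600, 1700, 201)          # amide I region, cm^-1
background = 1e-5 * np.sin(wavenumber / 30.0)      # VCD of the rest of the molecule
local = 5e-5 * np.exp(-((wavenumber - 1650) / 5.0) ** 2)  # amplified local band

vcd_on = background + local    # amplifier switched ON electrochemically
vcd_off = background           # amplifier switched OFF
vcd_local = vcd_on - vcd_off   # local environment of the amplifier

print(f"local VCD band peaks at {wavenumber[np.argmax(vcd_local)]:.0f} cm^-1")
```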


Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems with covariates of more complex form, such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model provides a promising way to reduce the model's dimensionality to a manageable level, thus leading to efficient estimation. Most existing tensor-based methods estimate each individual regression problem independently, based on a tensor decomposition that allows the simultaneous projection of an input tensor onto more than one direction along each mode. In practice, however, multi-dimensional data are often collected under the same or very similar conditions, so that the data share some common latent components while each regression task retains its own independent parameters. It is therefore beneficial to analyse the regression parameters of all the regressions in a linked way. In this paper, we propose a tensor regression model based on Tucker decomposition, which simultaneously identifies both the common components of the parameters across all the regression tasks and the independent factors contributing to each particular task. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modelling further reduce the total number of parameters, with lower memory cost than tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
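The linked Tucker-structured parameter model can be sketched as follows: all tasks share factor matrices while each task keeps its own sparse core. Only the forward model is shown (a Tucker-2 form for matrix-valued covariates); estimating the factors is the subject of the paper, and all names and sizes here are invented.

```python
import numpy as np

I, J = 8, 6            # size of each matrix-valued covariate
R1, R2 = 3, 2          # Tucker ranks along the two modes
T = 4                  # number of linked regression tasks

rng = np.random.default_rng(7)
U1 = rng.normal(size=(I, R1))          # mode-1 factors, shared by all tasks
U2 = rng.normal(size=(J, R2))          # mode-2 factors, shared by all tasks
cores = rng.normal(size=(T, R1, R2))   # task-specific cores
cores[np.abs(cores) < 0.8] = 0.0       # sparsity on the independent parameters

def coefficients(t):
    """Full coefficient matrix B_t = U1 @ G_t @ U2.T (Tucker-2 form)."""
    return U1 @ cores[t] @ U2.T

def predict(X, t):
    """Linear tensor regression: y = <X, B_t> (elementwise inner product)."""
    return float(np.sum(X * coefficients(t)))

X = rng.normal(size=(I, J))
print([round(predict(X, t), 3) for t in range(T)])
```

The parameter saving is visible in the counts: the linked model stores I*R1 + J*R2 shared entries plus T*R1*R2 core entries, versus T*I*J for T unlinked full coefficient matrices.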


Background: The electroencephalogram (EEG) may be described by a large number of different feature types, and automated feature selection methods are needed in order to reliably identify features which correlate with continuous independent variables. New method: A method is presented for the automated identification of features that differentiate two or more groups in neurological datasets based upon a spectral decomposition of the feature set. Furthermore, the method is able to identify features that relate to continuous independent variables. Results: The proposed method is first evaluated on synthetic EEG datasets and observed to reliably identify the correct features. The method is then applied to EEG recorded during a music listening task and is observed to automatically identify neural correlates of music tempo changes similar to neural correlates identified in a previous study. Finally, the method is applied to identify neural correlates of music-induced affective states. The identified neural correlates reside primarily over the frontal cortex and are consistent with widely reported neural correlates of emotions. Comparison with existing methods: The proposed method is compared to the state-of-the-art methods of canonical correlation analysis and common spatial patterns in order to identify features differentiating synthetic event-related potentials of different amplitudes, and is observed to exhibit greater performance as the number of unique groups in the dataset increases. Conclusions: The proposed method is able to identify neural correlates of continuous variables in EEG datasets and is shown to outperform canonical correlation analysis and common spatial patterns.
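In the spirit of the method, the sketch below decomposes a synthetic feature set spectrally (plain PCA via an eigendecomposition of the feature covariance, a stand-in for the paper's specific construction) and picks the component most correlated with a continuous variable such as music tempo. All data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_features = 200, 40
tempo = rng.uniform(60, 180, size=n_trials)     # continuous independent variable (bpm)

X = rng.normal(size=(n_trials, n_features))     # synthetic EEG-derived features
X[:, 5] += 0.02 * (tempo - tempo.mean())        # one feature weakly tracks tempo

# Spectral decomposition of the feature covariance.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (n_trials - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
scores = Xc @ eigvecs                           # component time courses

# Correlate every component with the continuous variable; keep the best.
corrs = [abs(np.corrcoef(scores[:, i], tempo)[0, 1]) for i in range(n_features)]
best = int(np.argmax(corrs))
print("best component correlates at |r| =", round(max(corrs), 2))
```

With real EEG, a permutation test on the best correlation would be needed before claiming a neural correlate, since the maximum over many components inflates the statistic.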


The theory of wave–mean flow interaction requires a partition of the atmospheric flow into a notional background state and perturbations to it. Here, a background state, known as the Modified Lagrangian Mean (MLM), is defined as the zonally symmetric state obtained by requiring that every potential vorticity (PV) contour lying within an isentropic layer encloses the same mass and circulation as in the full flow. For adiabatic and frictionless flow, these two integral properties are time-invariant and the MLM state is a steady solution of the primitive equations. The time dependence in the adiabatic flow is put into the perturbations, which can be described by a wave-activity conservation law that is exact even at large amplitude. Furthermore, the effects of non-conservative processes on wave activity can be calculated from the conservation law. A new method to calculate the MLM state is introduced, where the position of the lower boundary is obtained as part of the solution. The results are illustrated using Northern Hemisphere ERA-Interim data. The MLM state evolves slowly, implying that the net non-conservative effects are weak. Although ‘adiabatic eddy fluxes’ cannot affect the MLM state, the effects of Rossby-wave breaking, PV filamentation and subsequent dissipation result in sharpening of the polar vortex edge and meridional shifts in the MLM zonal flow, both at tropopause level and on the winter stratospheric vortex. The rate of downward migration of wave activity during stratospheric sudden warmings is shown to be given by the vertical scale associated with polar vortex tilt divided by the time-scale for wave dissipation estimated from the wave-activity conservation law. Aspects of troposphere–stratosphere interaction are discussed. The new framework is suitable to examine the climate and its interactions with disturbances, such as midlatitude storm tracks, and makes a clean partition between adiabatic and non-conservative processes.


Mixing layer height (MLH) is one of the key parameters describing lower-tropospheric dynamics, and capturing its diurnal variability is crucial, especially for interpreting surface observations. In this paper we introduce a method for identifying the MLH below the minimum range of a scanning Doppler lidar operated vertically. The method we propose is based on the velocity variance in low-elevation-angle conical scanning and is applied to measurements in two very different coastal environments: Limassol, Cyprus, during summer and Loviisa, Finland, during winter. At both locations, the new method agrees well with the MLH derived from turbulent kinetic energy dissipation rate profiles obtained from vertically pointing measurements. The low-level scanning routine frequently indicated a non-zero MLH less than 100 m above the surface. Such low MLHs were more common in wintertime Loviisa on the Baltic Sea coast than in summertime Mediterranean Limassol.
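A minimal sketch of threshold-based MLH identification from a variance profile follows. The profile, threshold and range gates are invented; note that the paper's method derives the variance from low-elevation conical scans, rather than the vertical stare implied by this simple profile.

```python
import numpy as np

heights = np.arange(30, 1530, 30.0)            # m, range-gate centres
# Synthetic variance profile: turbulent below ~400 m, quiescent above.
variance = np.where(heights < 400, 0.4, 0.02)  # m^2 s^-2

THRESHOLD = 0.1                                # m^2 s^-2, illustrative choice
above = np.nonzero(variance < THRESHOLD)[0]
# MLH = lowest gate where the variance drops below the turbulence threshold.
mlh = heights[above[0]] if above.size else heights[-1]
print(f"MLH = {mlh:.0f} m")
```

The point of the low-elevation scan is that its gates sit much closer to the surface than the lidar's minimum vertical range, which is what makes the sub-100 m MLH detections in the abstract possible.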