976 results for project delay estimation


Relevance: 30.00%

Abstract:

ODP Site 1089 is optimally located for monitoring maxima in Agulhas heat and salt spillage from the Indian to the Atlantic Ocean. Radiolarian-based paleotemperature transfer functions allowed us to reconstruct the climatic history of the last 450 kyr at this location. A warm sea surface temperature (SST) anomaly during Marine Isotope Stage (MIS) 10 was recognized and traced to other oceanic records along the surface branch of the global thermohaline circulation (THC) system; it is particularly marked at locations where a strong interaction between oceanic and atmospheric overturning cells and fronts occurs. This anomaly is absent from the Vostok ice core deuterium record and from oceanic records of the Antarctic Zone. However, it is present in the deuterium excess record from the Vostok ice core, which is interpreted as reflecting the temperature at the moisture source site for the snow precipitated at Vostok Station. As atmospheric models predict a subtropical Indian Ocean source for this moisture, this provides the necessary teleconnection between East Antarctica and ODP Site 1089, since the subtropical Indian Ocean is also the source area of the Agulhas Current, the main climate agent at our study location. The presence of the MIS 10 anomaly in the delta13C foraminiferal records from the same core supports its connection to oceanic mechanisms, linking stronger Agulhas spillover intensity to increased productivity in the study area. By analogy with modern oceanographic observations, we suggest this is a consequence of a shallow nutricline induced by eddy mixing and baroclinic tide generation, which are in turn connected to the flow geometry and intensity of the Agulhas Current as it flows past the Agulhas Bank. We interpret the intensified inflow of the Agulhas Current into the South Atlantic as a response to the switch between lower- and higher-amplitude insolation forcing in the Agulhas Current source area.
This would result in higher SSTs in the Cape Basin during the glacial MIS 10, due to the release into the South Atlantic of heat previously accumulated in the subtropical and equatorial Indian and Pacific Oceans. If our explanation of the MIS 10 anomaly in terms of an insolation variability switch is correct, we might expect a future Agulhas SST anomaly event to further delay the onset of the next glacial age. In fact, the insolation forcing conditions of the Holocene (the current interglacial) are very similar to those of MIS 11 (the interglacial preceding MIS 10), as both periods are characterized by low insolation variability in the Agulhas Current source area. Natural climatic variability would thus force the Earth system in the same direction as the anthropogenic global warming trend, leading to even warmer than expected global temperatures in the near future.

Relevance: 30.00%

Abstract:

We report an investigation of the statistics of group delay for few-mode fibres operating in the weak and strong linear coupling regimes as well as in the intermediate coupling regime. A single expression linking the standard deviation of the group delay spread to the fibre linear mode coupling is validated for any coupling regime, considering up to six linearly polarized guided modes. Furthermore, the study of the probability density function of the group delays allowed us to derive and validate an analytical estimate of the maximum group delay spread as a function of linear mode coupling.

Relevance: 30.00%

Abstract:

Although aspects of power generation of many offshore renewable devices are well understood, their dynamic responses under high wind and wave conditions have yet to be investigated in great detail. Output-only statistical markers are important for these offshore devices, since access to the device is limited and information about the exposure conditions and the true behaviour of the devices is generally partial, limited, vague, or even absent. The markers can summarise and characterise the behaviour of these devices from their dynamic response, available as time series data. The behaviour may be linear or nonlinear, so a marker that can track changes in structural condition is quite important. Such markers can then help assess the current condition of the structure and indicate the need for intervention, monitoring, or assessment. This paper considers a Delay Vector Variance (DVV) based marker for changes in a tension leg platform, tested in an ocean wave basin, where structural changes were brought about by single column dampers. The approach relies on the dynamic outputs of the device alone and is based on estimating the nonlinearity of the output signal. The advantages of the selected marker and its response to changing structural properties are discussed. The marker is observed to be important for monitoring the as-deployed structural condition and is sensitive to changes in such conditions. The influence of wave-loading exposure conditions is also discussed, based solely on experimental data.
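The abstract does not give the DVV computation itself; as a rough illustration of the standard delay vector variance analysis (after Gautama et al.), a minimal numpy sketch might look as follows. The embedding dimension m, span parameter nd, number of target variances, and minimum set size are arbitrary illustrative choices, not the paper's settings:

```python
import numpy as np

def dvv(signal, m=3, nd=2.0, n_tv=25, min_set=30):
    """Delay vector variance: normalized target variances sigma*^2(r)
    over a range of standardized distances r between delay vectors."""
    x = np.asarray(signal, float)
    N = len(x)
    # delay vectors dvs[k] = (x[k], ..., x[k+m-1]) and their targets x[k+m]
    dvs = np.stack([x[i:N - m + i] for i in range(m)], axis=1)
    targets = x[m:]
    # pairwise Euclidean distances between delay vectors
    d = np.linalg.norm(dvs[:, None, :] - dvs[None, :, :], axis=2)
    iu = np.triu_indices(len(dvs), k=1)
    mu, sd = d[iu].mean(), d[iu].std()
    spans = np.linspace(max(mu - nd * sd, 0.0), mu + nd * sd, n_tv)
    var_x = x.var()
    tv = []
    for r in spans:
        vs = []
        for k in range(len(dvs)):
            idx = np.where(d[k] <= r)[0]
            if len(idx) >= min_set:          # skip unreliably small sets
                vs.append(targets[idx].var())
        tv.append(np.mean(vs) / var_x if vs else np.nan)  # NaN where no set qualifies
    return spans, np.array(tv)
```

For a purely stochastic signal the normalized target variance stays near 1 across spans, whereas deterministic or nonlinear structure pulls the minimum well below 1, which is what makes it usable as a nonlinearity marker.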

Relevance: 30.00%

Abstract:

Quantile regression (QR) was first introduced by Roger Koenker and Gilbert Bassett in 1978. It is robust to outliers, which can strongly affect the least squares estimator in linear regression. Instead of modeling the mean of the response, QR provides an alternative way to model the relationship between quantiles of the response and the covariates. QR can therefore be widely applied to problems in econometrics, environmental sciences, and health sciences. Sample size is an important factor in the planning stage of experimental designs and observational studies. In ordinary linear regression, sample size may be determined from either precision analysis or power analysis using closed-form formulas. Methods that calculate sample size for QR based on precision analysis also exist, such as that of Jennen-Steinmetz and Wellek (2005), and a method based on power analysis was proposed by Shao and Wang (2009). In this project, a new method is proposed to calculate sample size based on power analysis under hypothesis tests of covariate effects. Even though no error distribution assumption is necessary for QR analysis itself, researchers have to make assumptions about the error distribution and covariate structure in the planning stage of a study to obtain a reasonable estimate of the sample size. Both parametric and nonparametric methods are provided to estimate the error distribution. Since the proposed method is implemented in R, the user can choose either a parametric distribution or nonparametric kernel density estimation for the error distribution. The user also needs to specify the covariate structure and effect size to carry out the sample size and power calculation. The performance of the proposed method is evaluated using numerical simulation. The results suggest that the sample sizes obtained from our method provide empirical powers close to the nominal power level, for example, 80%.
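The project's implementation is in R and is not reproduced here. As a hedged illustration of the simulation-based idea (empirical power of a covariate-effect test at a given quantile, from which the smallest adequate n can then be searched), a self-contained Python sketch follows. The IRLS fitting routine, the Monte Carlo critical value, and all parameter values are illustrative choices, not the thesis's method:

```python
import numpy as np

def fit_qr_irls(X, y, tau=0.5, iters=50, eps=1e-4):
    """Quantile regression via iteratively reweighted least squares,
    a smooth approximation to minimizing the check loss."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        # asymmetric weights approximating the check loss rho_tau(r)
        w = np.where(r > 0, tau, 1 - tau) / np.maximum(np.abs(r), eps)
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

def empirical_power(n, beta1, tau=0.5, alpha=0.05, reps=200, seed=1):
    """Monte Carlo power for H0: beta1 = 0 at quantile tau; the critical
    value is estimated by simulating the null (beta1 = 0) slope."""
    rng = np.random.default_rng(seed)
    def slope(b1):
        x = rng.uniform(0, 1, n)
        y = 1.0 + b1 * x + rng.standard_normal(n)   # assumed error model
        X = np.column_stack([np.ones(n), x])
        return fit_qr_irls(X, y, tau)[1]
    null = np.array([abs(slope(0.0)) for _ in range(reps)])
    crit = np.quantile(null, 1 - alpha)
    alt = np.array([abs(slope(beta1)) for _ in range(reps)])
    return (alt > crit).mean()
```

Scanning `empirical_power` over a grid of n and picking the first value reaching the nominal power (e.g. 0.8) gives a crude sample-size estimate; the error distribution inside `slope` is the planning-stage assumption the abstract refers to.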

Relevance: 30.00%

Abstract:

When we study the variables that affect survival time, we usually estimate their effects with the Cox regression model. In biomedical research, the effects of the covariates are often modified by a biomarker variable, which leads to covariate-biomarker interactions. Here the biomarker is an objective measurement of patient characteristics at baseline. Liu et al. (2015) built a local partial likelihood bootstrap model to estimate and test this interaction effect of covariates and biomarker, but the R code developed by Liu et al. (2015) can only handle one variable and one interaction term and cannot fit the model with adjustment for nuisance variables. In this project, we extend the model to allow adjustment for nuisance variables, extend the R code to take any chosen interaction terms, and set up many parameters for users to customize their research. We also build an R package called "lplb" to integrate the complex computations into a simple interface. We conduct numerical simulations to show that the new method has excellent finite sample properties under both the null and alternative hypotheses. We also apply the method to analyze data from a prostate cancer clinical trial with the acid phosphatase (AP) biomarker.

Relevance: 30.00%

Abstract:

NOGUEIRA, Marcelo B.; MEDEIROS, Adelardo A. D.; ALSINA, Pablo J. Pose Estimation of a Humanoid Robot Using Images from a Mobile Extern Camera. In: IFAC Workshop on Multivehicle Systems (MVS 2006), Salvador, BA, 2006.

Relevance: 30.00%

Abstract:

In this work, we present results from teleseismic P-wave receiver functions (PRFs) obtained in Portugal, Western Iberia. A dense seismic station deployment conducted between 2010 and 2012, within the scope of the WILAS project and covering the entire country, enabled the most spatially extensive probing of the bulk crustal seismic properties of Portugal to date. Applying the H-kappa stacking algorithm to the PRFs enabled us to estimate the crustal thickness (H) and the average crustal P- to S-wave velocity ratio Vp/Vs (kappa) for the region. Observations of Moho conversions indicate that this interface is relatively smooth, with the crustal thickness ranging between 24 and 34 km and averaging 30 km. The highest Vp/Vs values are found in the Mesozoic-Cenozoic crust beneath the western and southern coastal domains of Portugal, whereas the lowest values correspond to the Palaeozoic crust underlying the remaining part of the study area. The average Vp/Vs is 1.72, ranging from 1.63 to 1.86 across the study area and indicating a predominantly felsic composition. Overall, we systematically observe a decrease of Vp/Vs with increasing crustal thickness. Taken as a whole, our results indicate a clear distinction between the geological zones of the Variscan Iberian Massif in Portugal, whose anomalies are shaped by the Ibero-Armorican Arc and the associated Late Paleozoic suture zones, and the Meso-Cenozoic basin associated with the Atlantic rifting stages. Thickened crust (30-34 km) across the studied region may be inherited from continental collision during the Paleozoic Variscan orogeny. An anomalous crustal thinning to around 28 km is observed beneath the central part of the Central Iberian Zone and the eastern part of the South Portuguese Zone.
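The abstract does not spell out the H-kappa stacking objective. In the standard Zhu and Kanamori (2000) formulation (an assumption here, since the exact variant used in this study is not stated), the stack sums each receiver function r_j at the predicted arrival times of the Moho-converted phases:

```latex
s(H,\kappa) = \sum_j \Big[ w_1\, r_j(t_{Ps}) + w_2\, r_j(t_{PpPs}) - w_3\, r_j(t_{PpSs+PsPs}) \Big],
\qquad
\begin{aligned}
t_{Ps}        &= H\left(\sqrt{\kappa^2/V_p^2 - p^2} - \sqrt{1/V_p^2 - p^2}\right),\\
t_{PpPs}      &= H\left(\sqrt{\kappa^2/V_p^2 - p^2} + \sqrt{1/V_p^2 - p^2}\right),\\
t_{PpSs+PsPs} &= 2H\sqrt{\kappa^2/V_p^2 - p^2},
\end{aligned}
```

where p is the ray parameter of event j, the w_i are phase weights, and V_s = V_p/kappa has been substituted so the times depend only on H and kappa; the pair (H, kappa) maximizing s is taken as the crustal estimate.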



Relevance: 30.00%

Abstract:

The purpose of this project is to develop a three-dimensional block model for a garnet deposit at Alder Gulch, Madison County, Montana. Garnets occur in the Precambrian metamorphic Red Wash gneiss and similar rocks in the vicinity. This project seeks to model the percentage of garnet in the Section 25 deposit using the Surpac software. The data available for this work are drillhole, trench, and grab sample data obtained from previous exploration of the deposit. Creating the block model involves validating the data, compositing the assayed garnet percentages, and running basic statistics on the composites using Surpac's statistical tools. Variogram analysis will be conducted on the composites to quantify the continuity of the garnet mineralization. A three-dimensional block model will then be created and filled with estimates of garnet percentage using different reserve estimation methods, and the results compared.
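The abstract does not name the estimation methods to be compared; as one hedged illustration, inverse distance weighting (a common alternative or complement to kriging in block modelling) can be sketched in a few lines of Python. The function name, the search radius, and the power exponent are arbitrary choices, not Surpac's:

```python
import numpy as np

def idw_estimate(block_xyz, sample_xyz, sample_grade, power=2.0, max_radius=50.0):
    """Inverse-distance-weighted grade estimate for one block centroid,
    using only samples inside the search radius."""
    d = np.linalg.norm(sample_xyz - block_xyz, axis=1)
    mask = d <= max_radius
    if not mask.any():
        return np.nan                    # no samples in range: block stays unestimated
    w = 1.0 / np.maximum(d[mask], 1e-9) ** power
    return float(np.sum(w * sample_grade[mask]) / np.sum(w))
```

Looping this over every block centroid fills the model; kriging differs in that the weights come from the variogram model instead of raw distance, which is why the variogram analysis mentioned above precedes estimation.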

Relevance: 30.00%

Abstract:

A camera maps three-dimensional (3D) world space to a two-dimensional (2D) image space. In the process it loses depth information, i.e., the distance from the camera focal point to the imaged objects. It is impossible to recover this information from a single image; however, by using two or more images from different viewing angles it can be recovered, which in turn can be used to obtain the pose (position and orientation) of the camera. Using this pose, a 3D reconstruction of the imaged objects can be computed. Numerous algorithms, commonly called Structure from Motion (SfM), have been proposed and implemented to solve this problem, and state-of-the-art SfM techniques have been shown to give promising results. However, unlike a Global Positioning System (GPS) or an Inertial Measurement Unit (IMU), which directly give the position and orientation respectively, a camera system estimates the pose only after running SfM. This makes the pose obtained from a camera highly sensitive to the images captured and to other effects such as low lighting, poor focus, or improper viewing angles. In some applications, for example an Unmanned Aerial Vehicle (UAV) inspecting a bridge or a robot mapping an environment using Simultaneous Localization and Mapping (SLAM), it is often difficult to capture images under ideal conditions. This report examines the use of SfM methods in such applications and the role of combining multiple sensors, i.e., sensor fusion, in achieving more accurate and usable position and reconstruction information. Specifically, the project investigates the role of sensor fusion in accurately estimating the pose of a camera for 3D reconstruction of a scene. The first set of experiments is conducted in a motion capture room; its results are taken as ground truth in order to evaluate the strengths and weaknesses of each sensor and to map their coordinate systems.
A number of scenarios where SfM fails are then targeted: the pose estimates obtained from SfM are replaced by those obtained from the other sensors, and the 3D reconstruction is completed. Quantitative and qualitative comparisons are made between the 3D reconstruction obtained using only a camera and that obtained using the camera along with a LIDAR and/or an IMU. Additionally, the project addresses the performance issues faced when handling large sets of high-resolution images by implementing the system on the Superior high performance computing cluster at Michigan Technological University.
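The report's SfM pipeline is not detailed in the abstract. As a small illustration of the core step it describes (recovering a 3D point from two views once the camera poses are known), here is a minimal linear (DLT) triangulation sketch in numpy; projection matrices and pixel coordinates are assumed to be in normalized form. This is generic textbook material, not the report's implementation:

```python
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Linear (DLT) triangulation of one 3D point from two 3x4 camera
    projection matrices and the corresponding (x, y) observations."""
    # each observation contributes two rows of the homogeneous system A X = 0
    A = np.vstack([
        pt1[0] * P1[2] - P1[0],
        pt1[1] * P1[2] - P1[1],
        pt2[0] * P2[2] - P2[0],
        pt2[1] * P2[2] - P2[1],
    ])
    # the null vector of A (last right singular vector) is the point in
    # homogeneous coordinates; dehomogenize to get (X, Y, Z)
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

The sensitivity the abstract mentions shows up here directly: noisy pixel observations or a poorly estimated P matrix perturb A, so the recovered point degrades with image quality, which is the motivation for fusing LIDAR and IMU pose estimates.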

Relevance: 30.00%

Abstract:

In the agri-food sector, measurement and monitoring activities contribute to high-quality end products. In particular, for food of plant origin, several product quality attributes can be monitored. Among non-destructive measurement techniques, a large variety of optical techniques are available, including hyperspectral imaging (HSI) in the visible/near-infrared (Vis/NIR) range, which, thanks to its capacity to integrate image analysis and spectroscopy, has proved particularly useful in agronomy and food science. Many published studies on HSI systems were carried out under controlled laboratory conditions; in contrast, few studies describe the application of HSI technology directly in the field, in particular for high-resolution proximal measurements carried out on the ground. Against this background, the activities of the present PhD project aimed to explore and deepen knowledge of the application of optical techniques for estimating quality attributes of agri-food plant products. First, laboratory trials on apricots and kiwis for the estimation of soluble solids content (SSC) and flesh firmness (FF) through HSI are reported; subsequently, FF was estimated on kiwis using a NIR-sensitive device, and the procyanidin content of red wine was estimated with a device based on the pulsed spectral sensitive photometry technique. In the second part, trials were carried out directly in the field to assess the degree of ripeness of red wine grapes by estimating SSC through HSI, and finally a method for the automatic selection of regions of interest in hyperspectral images of the vineyard was developed.
These activities have revealed the potential of optical techniques for sorting-line applications; moreover, the application of the HSI technique directly in the field has proved particularly interesting, suggesting further investigations to solve the variety of problems arising from the many environmental variables that may affect the results of the analyses.
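The chemometric models used in the project are not specified in the abstract. As a hedged, self-contained illustration of the general idea (calibrating a predictor of SSC from band reflectances), the following Python sketch fits an ordinary least-squares model on synthetic spectra; real HSI work would typically use PLS regression on far more bands, and every number here is made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic calibration set: 60 fruit "spectra" with 8 bands, where SSC
# (in hypothetical degrees Brix) is driven by two of the bands plus noise
n_samples, n_bands = 60, 8
spectra = rng.uniform(0.1, 1.0, (n_samples, n_bands))
ssc = 5.0 + 6.0 * spectra[:, 2] - 3.0 * spectra[:, 5] + rng.normal(0, 0.1, n_samples)

# calibration: least-squares fit of SSC on band reflectances plus intercept
X = np.column_stack([np.ones(n_samples), spectra])
coef, *_ = np.linalg.lstsq(X, ssc, rcond=None)

# goodness of fit on the calibration set
pred = X @ coef
r2 = 1 - np.sum((ssc - pred) ** 2) / np.sum((ssc - ssc.mean()) ** 2)
```

In practice the model would be validated on held-out fruit and, for field use, made robust to the environmental variables (illumination, temperature) the abstract warns about.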

Relevance: 30.00%

Abstract:

Emissions estimation, both during homologation and in standard driving, is one of the new challenges the automotive industry has to face. The new European and American regulations will allow lower and lower quantities of carbon monoxide emissions and will require that all vehicles be able to monitor their own pollutant production. Since numerical models are too computationally expensive and too approximate, new solutions based on machine learning are replacing standard techniques. In this project we considered a real V12 internal combustion engine and propose a novel approach pushing random forests to generate meaningful predictions even in extreme cases (extrapolation, very high frequency peaks, noisy instrumentation, etc.). The present work also proposes a data preprocessing pipeline for strongly unbalanced datasets and a reinterpretation of the regression problem as a classification problem in a logarithmically quantized domain. Results have been evaluated for two different models representing a pure interpolation scenario (more standard) and an extrapolation scenario, to test the out-of-bounds robustness of the model. The metrics employed take into account different aspects that can affect the homologation procedure, so the final analysis focuses on combining all the specific performances to obtain overall conclusions.
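The thesis's exact binning scheme is not given. As a hedged sketch of the reinterpretation it describes (regression recast as classification on a logarithmic grid, so that rare high-emission peaks occupy their own classes instead of being averaged away), the following shows one possible quantize/dequantize pair in Python; the bin count and value range are arbitrary:

```python
import numpy as np

def log_quantize(values, n_bins=8, lo=1e-2, hi=1e2):
    """Map positive emission values to integer class labels 0..n_bins-1
    on a logarithmic grid; values outside [lo, hi] are clipped."""
    bounds = np.logspace(np.log10(lo), np.log10(hi), n_bins + 1)
    return np.digitize(np.clip(values, lo, hi), bounds[1:-1])

def dequantize(labels, n_bins=8, lo=1e-2, hi=1e2):
    """Inverse map: class label -> geometric centre of its bin."""
    bounds = np.logspace(np.log10(lo), np.log10(hi), n_bins + 1)
    centres = np.sqrt(bounds[:-1] * bounds[1:])
    return centres[labels]
```

A classifier (such as the random forest in the abstract) is then trained on the integer labels, and `dequantize` turns predicted classes back into emission values; equal-width bins in log space give low and high concentrations comparable resolution, which is the point of the transformation for strongly unbalanced targets.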

Relevance: 20.00%

Abstract:

The 2005 National Institutes of Health (NIH) Consensus Conference proposed new criteria for diagnosing and scoring the severity of chronic graft-versus-host disease (GVHD). The 2014 NIH consensus maintains the framework of the prior consensus with further refinement based on new evidence. Revisions have been made to address areas of controversy or confusion, such as the overlap chronic GVHD subcategory and the distinction between active disease and past tissue damage. Diagnostic criteria for involvement of mouth, eyes, genitalia, and lungs have been revised. Categories of chronic GVHD should be defined in ways that indicate prognosis, guide treatment, and define eligibility for clinical trials. Revisions have been made to focus attention on the causes of organ-specific abnormalities. Attribution of organ-specific abnormalities to chronic GVHD has been addressed. This paradigm shift provides greater specificity and more accurately measures the global burden of disease attributed to GVHD, and it will facilitate biomarker association studies.