890 results for Robust tori
Abstract:
robreg provides a number of robust estimators for linear regression models. Among them are the high breakdown-point and high efficiency MM-estimator, the Huber and bisquare M-estimator, and the S-estimator, each supporting classic or robust standard errors. Furthermore, basic versions of the LMS/LQS (least median of squares) and LTS (least trimmed squares) estimators are provided. Note that the moremata package, also available from SSC, is required.
Abstract:
Many location-based services target users in indoor environments. As in dense urban areas where many obstacles exist, indoor localization techniques suffer from outlying measurements caused by severe multipath propagation and non-line-of-sight (NLOS) reception. Obstructions in the signal path caused by static or mobile objects degrade localization accuracy. We use robust multipath mitigation techniques to detect and filter out outlying measurements in indoor environments. We validate our approach using a power-based localization system with GSM. We conducted experiments without any prior knowledge of the tracked device's radio settings or the indoor radio environment. We obtained localization errors in the range of 3 m even when the sensors had NLOS links to the target device.
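A common way to detect such outlying measurements, sketched here in Python with assumed received-signal-strength values (not the paper's actual algorithm), is a median-absolute-deviation filter:

```python
import numpy as np

def filter_outliers_mad(rss_dbm, k=3.0):
    """Drop received-signal-strength samples whose deviation from the
    median exceeds k scaled median absolute deviations (MAD)."""
    rss = np.asarray(rss_dbm, dtype=float)
    med = np.median(rss)
    mad = np.median(np.abs(rss - med))
    sigma = 1.4826 * mad  # MAD -> std-dev scale for Gaussian data
    if sigma == 0:
        return rss  # all samples identical; nothing to filter
    return rss[np.abs(rss - med) <= k * sigma]

# Illustrative measurements: one NLOS-like outlier at -95 dBm
clean = filter_outliers_mad([-62, -61, -63, -60, -95, -62])
```

The median/MAD pair is preferred over mean/standard deviation here because a single NLOS measurement can shift the mean arbitrarily far.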
Abstract:
Io's plasma and neutral tori play significant roles in the Jovian magnetosphere. We present feasibility studies of measuring low-energy energetic neutral atoms (LENAs) generated from the Io tori. We calculate the LENA flux between 10 eV and 3 keV, an energy range that includes the corotational plasma flow energy. The expected differential flux at Ganymede's distance is typically 10^3-10^5 cm^-2 s^-1 sr^-1 eV^-1 near the corotation energy. This is above the detection level of the planned LENA sensor to be flown to the Jupiter system, with integration times of 0.01-1 s. The flux has a strong asymmetry with respect to the Io phase. The observations will exhibit periodicities, which can be attributed to the rotation of the Jovian magnetosphere and the rotation of Io around Jupiter. The energy spectra will exhibit dispersion signatures because of the non-negligible flight time of the LENAs from Io to the satellite. In 2030, the Jupiter exploration mission JUICE will conduct LENA measurements with a LENA instrument, the Jovian Neutrals Analyzer (JNA). From the LENA observations collected by JNA, we will be able to derive characteristic quantities such as the density, velocity, velocity distribution function, and composition of plasma-torus particles. We also discuss the physics that JNA may explore, in addition to the constraints on operating the sensor and analyzing the obtained dataset.
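The dispersion signature follows from simple kinematics: since no forces act on a neutral, its flight time over a fixed path scales as 1/sqrt(E). A rough sketch, with an assumed straight-line Io-to-Ganymede path length and atomic oxygen as the species (illustrative values only, not the paper's model):

```python
import math

EV = 1.602176634e-19   # J per eV
M_O = 2.6567e-26       # kg, atomic oxygen (a typical torus species)
R_J = 7.1492e7         # m, Jupiter equatorial radius

def time_of_flight(energy_ev, distance_m, mass_kg=M_O):
    """Flight time of a neutral atom of given kinetic energy over a
    straight-line path (neutrals feel no electromagnetic forces)."""
    v = math.sqrt(2.0 * energy_ev * EV / mass_kg)
    return distance_m / v

# Assumed Io (~5.9 R_J) to Ganymede (~15 R_J) radial separation
d = (15.0 - 5.9) * R_J
t_slow = time_of_flight(10.0, d)    # 10 eV atom
t_fast = time_of_flight(3000.0, d)  # 3 keV atom
```

The hours-scale spread between the slow and fast ends of the 10 eV-3 keV band is what produces the energy-time dispersion in the observed spectra.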
Plant diversity effects on grassland productivity are robust to both nutrient enrichment and drought
Abstract:
Global change drivers are rapidly altering resource availability and biodiversity. While there is consensus that greater biodiversity increases the functioning of ecosystems, the extent to which biodiversity buffers ecosystem productivity in response to changes in resource availability remains unclear. We use data from 16 grassland experiments across North America and Europe that manipulated plant species richness and one of two essential resources—soil nutrients or water—to assess the direction and strength of the interaction between plant diversity and resource alteration on above-ground productivity and net biodiversity, complementarity, and selection effects. Despite strong increases in productivity with nutrient addition and decreases in productivity with drought, we found that resource alterations did not alter biodiversity–ecosystem functioning relationships. Our results suggest that these relationships are largely determined by increases in complementarity effects along plant species richness gradients. Although nutrient addition reduced complementarity effects at high diversity, this appears to be due to high biomass in monocultures under nutrient enrichment. Our results indicate that diversity and the complementarity of species are important regulators of grassland ecosystem productivity, regardless of changes in other drivers of ecosystem function.
Abstract:
Proton therapy is growing increasingly popular due to its superior dose characteristics compared with conventional photon therapy. Protons travel a finite range in the patient's body and stop, thereby delivering no dose beyond their range. However, because the range of a proton beam is heavily dependent on the tissue density along its beam path, uncertainties in patient setup position and in the inherent range calculation can degrade the dose distribution significantly. Despite these challenges, which are unique to proton therapy, current management of the uncertainties during treatment planning for proton therapy has been similar to that of conventional photon therapy. The goal of this dissertation research was to develop a treatment planning method and a plan-evaluation method that address proton-specific issues regarding setup and range uncertainties. Treatment plan design method adapted to proton therapy: Currently, for proton therapy using a scanning beam delivery system, setup uncertainties are largely accounted for by geometrically expanding a clinical target volume (CTV) to a planning target volume (PTV). However, a PTV alone cannot adequately account for range uncertainties coupled to misaligned patient anatomy in the beam path, since it does not account for the change in tissue density. To remedy this problem, we proposed a beam-specific PTV (bsPTV) that accounts for the change in tissue density along the beam path due to the uncertainties. Our proposed method was successfully implemented, and its superiority over the conventional PTV was shown through a controlled experiment. Furthermore, we have shown that the bsPTV concept can be incorporated into beam angle optimization for better target coverage and normal tissue sparing for a selected lung cancer patient.
Treatment plan evaluation method adapted to proton therapy: The dose-volume histogram of the clinical target volume (CTV), or of any other volume of interest, at the time of planning does not represent the most probable dosimetric outcome of a given plan, as it does not include the uncertainties mentioned earlier. Currently, the PTV is used as a surrogate for the CTV's worst-case scenario in target dose estimation. However, because proton dose distributions are subject to change under these uncertainties, the validity of the PTV analysis method is questionable. To remedy this problem, we proposed the use of statistical parameters to quantify the uncertainties, both on the dose-volume histogram and directly on the dose distribution. The robust plan analysis tool was successfully implemented to compute both the expectation value and the standard deviation of the dosimetric parameters of a treatment plan under the uncertainties. For 15 lung cancer patients, the proposed method was used to quantify the dosimetric difference between the nominal situation and its expected value under the uncertainties.
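The expectation value and standard deviation of a dosimetric parameter under uncertainty can be estimated by Monte Carlo sampling over setup and range errors. The following sketch uses a toy one-dimensional dose profile and assumed uncertainty magnitudes; it illustrates the idea for D95, not the dissertation's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def d95(dose_profile, target_mask):
    """Dose received by at least 95% of the target (5th percentile)."""
    return np.percentile(dose_profile[target_mask], 5)

def expected_metric(dose_fn, n_samples=1000, setup_sd=2.0, range_sd=0.03):
    """Mean and standard deviation of a dose metric under sampled
    setup shifts (mm) and relative range errors (assumed Gaussian)."""
    vals = []
    for _ in range(n_samples):
        shift = rng.normal(0.0, setup_sd)
        range_err = rng.normal(0.0, range_sd)
        vals.append(dose_fn(shift, range_err))
    vals = np.asarray(vals)
    return vals.mean(), vals.std()

# Toy 1D proton-like dose: flat plateau with a distal falloff at depth r0
x = np.linspace(0.0, 120.0, 601)          # depth grid, mm
target = (x >= 50.0) & (x <= 90.0)        # hypothetical CTV extent

def toy_d95(shift_mm, range_err):
    r0 = 95.0 * (1.0 + range_err) + shift_mm  # perturbed distal edge
    dose = np.where(x <= r0, 1.0, np.exp(-(x - r0) / 3.0))
    return d95(dose, target)

mean_d95, sd_d95 = expected_metric(toy_d95)
```

The same loop applies unchanged to any dose-volume-histogram statistic; only `dose_fn` changes.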
Abstract:
The hierarchical linear growth model (HLGM), a flexible and powerful analytic method, has played an increasingly important role in psychology, public health, and the medical sciences in recent decades. Mostly, researchers who conduct HLGM analyses are interested in the treatment effect on individual trajectories, which can be indicated by cross-level interaction effects. However, the statistical hypothesis test for a cross-level interaction effect in HLGM only tells us whether there is a significant group difference in the average rate of change, rate of acceleration, or a higher polynomial effect; it fails to convey information about the magnitude of the difference between the group trajectories at a specific time point. Thus, reporting and interpreting effect sizes has received increasing emphasis in HLGM in recent years, owing to the limitations of, and growing criticism of, statistical hypothesis testing. However, most researchers fail to report these model-implied effect sizes for group-trajectory comparisons and their corresponding confidence intervals in HLGM analyses, since there is a lack of appropriate, standard functions to estimate effect sizes associated with the model-implied difference between group trajectories in HLGM, and a lack of computing packages in popular statistical software to calculate them automatically. The present project is the first to establish appropriate computing functions to assess the standardized difference between group trajectories in HLGM. We propose two functions to estimate effect sizes for the model-based difference between group trajectories at a specific time, and we also suggest robust effect sizes to reduce the bias of the estimated effect sizes.
We then applied the proposed functions to estimate the population effect sizes (d) and robust effect sizes (du) for the cross-level interaction in HLGM using three simulated datasets; we also compared three methods of constructing confidence intervals around d and du, and recommended the best one for application. Finally, we constructed 95% confidence intervals, using the most suitable method, for the effect sizes obtained from the three simulated datasets. The effect sizes between group trajectories for the three simulated longitudinal datasets indicated that even when the statistical hypothesis test shows no significant difference between group trajectories, effect sizes between these trajectories can still be large at some time points. Therefore, effect sizes between group trajectories in an HLGM analysis provide additional, meaningful information for assessing group effects on individual trajectories. In addition, we compared the three methods of constructing 95% confidence intervals around the corresponding effect sizes, which address the uncertainty of the effect sizes as estimates of population parameters. We suggest the noncentral t-distribution-based method when its assumptions hold, and the bootstrap bias-corrected and accelerated method when they are not met.
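A minimal sketch of the idea in Python, using a standard Cohen's-d-style standardized difference on hypothetical per-person growth slopes and a percentile bootstrap interval for simplicity (the project itself recommends noncentral-t or bias-corrected-and-accelerated intervals). All data below are simulated assumptions:

```python
import numpy as np
from scipy.stats import bootstrap

def cohens_d(x, y):
    """Standardized difference between two group trajectories at one time point."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1)
                  + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

rng = np.random.default_rng(1)
g1 = rng.normal(1.2, 1.0, 40)  # hypothetical individual slopes, treatment group
g2 = rng.normal(0.8, 1.0, 40)  # hypothetical individual slopes, control group

d = cohens_d(g1, g2)
res = bootstrap((g1, g2), cohens_d, method='percentile',
                vectorized=False, n_resamples=2000, random_state=2)
ci = res.confidence_interval  # (low, high)
```

Reporting `d` together with `ci` conveys the magnitude information that a p-value for the cross-level interaction alone cannot.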
Abstract:
Arctic permafrost landscapes are among the most vulnerable and dynamic landscapes globally, but due to their extent and remoteness most of the landscape changes remain unnoticed. In order to detect disturbances in these areas we developed an automated processing chain for the calculation and analysis of robust trends of key land surface indicators based on the full record of available Landsat TM, ETM+, and OLI data. The methodology was applied to the ~29,000 km² Lena Delta in Northeast Siberia, where robust trend parameters (slope, confidence intervals of the slope, and intercept) were calculated for Tasseled Cap Greenness, Wetness, and Brightness, as well as NDVI, NDWI, and NDMI, based on 204 Landsat scenes for the observation period between 1999 and 2014. The resulting datasets revealed regional greening trends within the Lena Delta with several localized hot-spots of change, particularly in the vicinity of the main river channels. With a 30-m spatial resolution, various permafrost-thaw related processes and disturbances, such as thermokarst lake expansion and drainage, fluvial erosion, and coastal changes, were detected within the Lena Delta region, many of which had not been noticed or described before. Such hotspots of permafrost change exhibit significantly different trend parameters compared to non-disturbed areas. The processed dataset, which is made freely available through the data archive PANGAEA, will be a useful resource for further process-specific analysis by researchers and land managers. With the high level of automation and the use of the freely available Landsat archive data, the workflow is scalable and transferable to other regions, which should enable the comparison of land surface changes in different permafrost-affected regions and help to understand and quantify permafrost landscape dynamics.
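Per-pixel robust trend parameters of this kind (slope, intercept, and a confidence interval for the slope) can be computed with the Theil-Sen estimator; a sketch for a single hypothetical NDVI time series, not the paper's actual processing chain:

```python
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(0)
years = np.arange(1999, 2015, dtype=float)
# Hypothetical pixel NDVI series: gentle greening plus one unscreened artifact
ndvi = 0.30 + 0.004 * (years - 1999) + rng.normal(0.0, 0.005, years.size)
ndvi[5] = 0.05  # e.g. an undetected cloud shadow

# Median of pairwise slopes; robust to the outlier observation
slope, intercept, lo, hi = theilslopes(ndvi, years, alpha=0.95)
```

Because Theil-Sen takes the median over all pairwise slopes, a single bad scene barely moves the fitted trend, which is exactly the property needed when trends are computed fully automatically over thousands of pixels.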
Abstract:
Predicting species' potential and future distributions has become a relevant tool in biodiversity monitoring and conservation. In this data article we present the suitability map of a virtual species, generated from two bioclimatic variables, and a dataset containing more than 700,000 random observations at the extent of Europe. The dataset includes spatial attributes such as distance to roads, protected areas, country codes, and the habitat suitability of two spatially clustered species (a grassland and a forest species) and a widespread species.
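A virtual species' suitability map is typically derived from response curves over the bioclimatic variables; a minimal sketch assuming Gaussian responses with hypothetical optima and tolerances (not the article's actual parameters):

```python
import numpy as np

def gaussian_response(x, opt, sd):
    """Species response to one bioclimatic variable (1 at the optimum)."""
    return np.exp(-0.5 * ((x - opt) / sd) ** 2)

# Hypothetical gridded bioclimatic variables (e.g. temperature, precipitation)
rng = np.random.default_rng(0)
temp = rng.uniform(-5, 25, size=(100, 100))
precip = rng.uniform(200, 1500, size=(100, 100))

# Suitability as the product of the two univariate responses
suitability = (gaussian_response(temp, opt=12.0, sd=5.0)
               * gaussian_response(precip, opt=900.0, sd=250.0))
```

Thresholding or probabilistically sampling such a suitability surface then yields the virtual presence/absence observations that make up the dataset.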
Abstract:
Recently, vision-based advanced driver-assistance systems (ADAS) have received renewed interest as a means of enhancing driving safety. In particular, owing to their high performance-to-cost ratio, mono-camera systems have become the main focus of this field of work. In this paper we present a novel on-board road modeling and vehicle detection system, developed as part of the European I-WAY project. The system relies on a robust estimation of the perspective of the scene, which adapts to the dynamics of the vehicle and generates a stabilized, rectified image of the road plane. This rectified plane is used by a recursive Bayesian classifier, which assigns pixels to different classes corresponding to the elements of interest in the scenario. This stage works as an intermediate layer that isolates subsequent modules, since it absorbs the inherent variability of the scene. The system has been tested on-road in different scenarios, including varied illumination and adverse weather conditions, and the results proved remarkable even in such complex scenarios.
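The recursive Bayesian classification step can be sketched as a per-pixel Bayes update of class probabilities, carried from frame to frame; the class set and likelihood values below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def bayes_update(prior, likelihoods):
    """One recursive Bayesian update of per-pixel class probabilities.

    prior:        (H, W, K) class probabilities from the previous frame
    likelihoods:  (H, W, K) p(observation | class) for the current frame
    """
    posterior = prior * likelihoods
    posterior /= posterior.sum(axis=-1, keepdims=True)  # renormalize per pixel
    return posterior

# Toy example: 2x2 rectified image, K=3 classes (road, lane marking, obstacle)
prior = np.full((2, 2, 3), 1.0 / 3.0)   # uninformative first-frame prior
like = np.ones((2, 2, 3))
like[0, 0] = [0.7, 0.2, 0.1]            # this pixel strongly resembles road
post = bayes_update(prior, like)
```

Feeding `post` back as the next frame's `prior` is what makes the classifier recursive: evidence accumulates over time, which absorbs frame-to-frame scene variability.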
Abstract:
In this work, the robustness and stability of continuum damage models applied to material failure in soft tissues are addressed. In implicit damage models equipped with softening, the presence of negative eigenvalues in the tangent elemental matrix degrades the condition number of the global matrix, reducing the computational performance of the numerical model. Two strategies have been adapted from the literature to mitigate this performance degradation: the IMPL-EX integration scheme [Oliver, 2006], which renders the elemental matrix contribution positive definite, and arc-length continuation methods [Carrera, 1994], which allow capturing the unstable softening branch in brittle ruptures. The major drawback of the IMPL-EX integration scheme is the need for small time steps to keep the numerical error below an acceptable value. A convergence study, limiting the maximum allowed increment of the internal variables of the damage model, is presented. Finally, numerical simulations of failure problems with fibre-reinforced materials illustrate the performance of the adopted methodology.
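The essence of IMPL-EX is to extrapolate the internal variable explicitly from the previous steps and perform the stress update with that frozen value, so the step's tangent contribution stays positive definite; the implicit value is then recovered for the next step. A one-dimensional sketch with an assumed exponential softening law (illustrative only, not the paper's constitutive model):

```python
import numpy as np

E, r0, H = 200.0, 0.01, 50.0   # Young's modulus, damage threshold, softening rate

def damage(r):
    """Exponential softening damage law, d in [0, 1)."""
    return np.where(r <= r0, 0.0, 1.0 - (r0 / r) * np.exp(-H * (r - r0)))

def implex_step(eps_new, r_prev, r_prev2):
    """IMPL-EX update: extrapolate the internal variable from the two
    previous implicit values (equal time steps assumed), compute the
    stress with it frozen, then do the implicit update for next time."""
    r_tilde = r_prev + (r_prev - r_prev2)      # explicit extrapolation
    sigma = (1.0 - damage(r_tilde)) * E * eps_new
    r_new = max(r_prev, abs(eps_new))          # implicit internal variable
    return sigma, r_new

# Monotonic stretching of a 1D bar through the softening regime
r_prev = r_prev2 = r0
for eps in np.linspace(0.0, 0.05, 200):
    sigma, r_new = implex_step(eps, r_prev, r_prev2)
    r_prev2, r_prev = r_prev, r_new
```

Because the damage factor is fixed within the step, each step behaves like a linear elastic problem with a degraded but positive stiffness, which is why the global matrix stays well conditioned; the price, as noted above, is an extrapolation error that forces small time steps.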
Abstract:
In air transportation, airline profitability is influenced by the airline's ability to build flight schedules. To generate operational schedules, airlines engage in a complex decision-making process referred to as airline schedule planning. To date, the generation of flight schedules has been separated into phases and optimized sequentially. Schedule design has traditionally been decomposed into two sequential steps: frequency planning and timetable development. The second problem of schedule development, fleet assignment, assigns available aircraft types to flight legs such that the seating capacity of the assigned aircraft closely matches flight demand and costs are minimized. Our work integrates these planning phases into a single model in order to produce more economical solutions and create fewer incompatibilities between the decisions. We propose an integrated robust approach for the schedule development step. We design the timetable ensuring that enough time is available to perform passengers' flight connections, making the system robust by avoiding misconnected passengers. An application of the model to a simplified IBERIA network is shown.
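The robustness criterion, enough time for passengers' flight connections, can be checked directly on a candidate timetable; a minimal sketch with hypothetical flights and an assumed minimum connection time (not the authors' optimization model):

```python
from dataclasses import dataclass

@dataclass
class Flight:
    origin: str
    dest: str
    dep: int   # departure, minutes from midnight
    arr: int   # arrival, minutes from midnight

def misconnected(itineraries, min_connect=45):
    """Count passenger itineraries whose connection time at the transfer
    airport falls below the minimum connection time (in minutes)."""
    bad = 0
    for inbound, outbound in itineraries:
        assert inbound.dest == outbound.origin, "not a valid connection"
        if outbound.dep - inbound.arr < min_connect:
            bad += 1
    return bad

# Hypothetical mini-network: two connecting itineraries through MAD
f1 = Flight("BCN", "MAD", dep=8 * 60, arr=9 * 60 + 20)
f2 = Flight("MAD", "LHR", dep=9 * 60 + 50, arr=12 * 60)       # 30 min: too tight
f3 = Flight("MAD", "CDG", dep=10 * 60 + 30, arr=12 * 60 + 30)  # 70 min: fine
n_bad = misconnected([(f1, f2), (f1, f3)])
```

In an integrated model this count (or the slack itself) enters the timetable-development constraints, so departure times are shifted until no itinerary falls below the minimum connection time.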
Abstract:
This paper focuses on the railway rolling stock circulation problem in rapid transit networks, in which frequencies are high and distances are relatively short. Although the distances are not very large, service times are long due to the large number of intermediate stops required to allow proper passenger flow. The main complicating issue is that the available capacity at depot stations is very low, and both capacity and rolling stock are shared between different train lines. This forces the introduction of empty train movements and rotation maneuvers to ensure sufficient station capacity and rolling stock availability. However, these shunting operations may sometimes be difficult to perform and can easily malfunction, causing localized incidents that could propagate throughout the entire network through cascading effects. This type of operation is penalized in the model with the goal of selectively avoiding it and mitigating its high malfunction probability. Critical trains, defined as train services that pass through stations with a large number of passengers arriving at the platform during rush hours, are also introduced. We illustrate our model using computational experiments drawn from RENFE (the main Spanish operator of suburban passenger trains) in Madrid, Spain. The results of the model, achieved in approximately 1 min, have been received positively by RENFE planners.
Abstract:
Fractal and multifractal concepts have grown increasingly popular in soil analysis in recent years, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, usually with the least squares method. This shouldn't be a special problem; however, in many situations with experimental data the researcher has to select the range of scales at which to work, neglecting the remaining points, in order to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we don't have to assume that an outlier is simply an extreme observation drawn from the tail of a normal distribution that does not compromise the validity of the regression results. In this work we evaluated the capacity of robust regression to select the points of the experimental data to use, trying to avoid subjective choices. Based on this analysis we developed a new working methodology that involves two basic steps:
- Evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the R p-value. In this way we consider the implications of reducing the number of points.
- Evaluation of the significance of the difference between the slope fitted with the two extreme points included and the slope fitted with the remaining points.
We compare the results of applying this methodology with those of the commonly used least squares approach. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology.
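Robust regression of this kind can be realized, for example, by iteratively reweighted least squares with Huber weights, which automatically downweights off-scaling points instead of requiring a manual choice of scale range; a self-contained sketch on synthetic scaling data (illustrative, not the authors' exact procedure):

```python
import numpy as np

def huber_irls(x, y, c=1.345, n_iter=50):
    """Robust straight-line fit via iteratively reweighted least squares
    with Huber weights; returns (slope, intercept)."""
    A = np.column_stack([x, np.ones_like(x)])
    beta = np.linalg.lstsq(A, y, rcond=None)[0]       # OLS starting point
    for _ in range(n_iter):
        r = y - A @ beta
        s = np.median(np.abs(r - np.median(r))) * 1.4826 or 1.0  # robust scale
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)              # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)[0]
    return beta[0], beta[1]

# Log-log scaling data with two off-scaling points at the coarse end
x = np.linspace(0.0, 3.0, 20)
y = 1.8 * x + 0.5
y[-2:] += 2.0                                         # departure from linear scaling
slope, intercept = huber_irls(x, y)
```

An ordinary least squares fit on the same data is pulled noticeably upward by the last two points; the Huber fit recovers the scaling slope of the remaining range without the researcher discarding points by hand.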
Abstract:
Understanding embryogenesis in living systems requires reliable quantitative analysis of cell migration throughout all stages of development. This is a major challenge for "in-toto" reconstruction based on different modalities of "in-vivo" imaging techniques, given their limited spatio-temporal resolution and their image artifacts and noise. Several methods for cell tracking are available, but expensive manual interaction (time and human resources) is always required to enforce coherence. Because of this limitation it is necessary to restrict the experiments or accept an uncontrolled error rate. Is it possible to obtain automated, reliable measurements of migration? Can we provide a seed for biologists to complete cell lineages efficiently? We propose a filtering technique that treats trajectories as spatio-temporal connected structures and, using multi-dimensional morphological operators, prunes out those that might introduce noise and false positives.
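The pruning idea can be sketched as a spatio-temporal connected-component filter: detections are labeled as (t, y, x)-connected structures and short-lived ones are discarded. The names, connectivity, and threshold below are illustrative assumptions, not the authors' operators:

```python
import numpy as np
from scipy import ndimage

def prune_short_tracks(detections, min_length=4):
    """Keep only spatio-temporal connected components (candidate
    trajectories) that span at least `min_length` time frames.

    detections: boolean array of shape (T, Y, X), True where a cell
    candidate was detected.
    """
    # 26-connectivity in (t, y, x): detections in adjacent frames and
    # neighbouring pixels belong to the same candidate trajectory.
    structure = np.ones((3, 3, 3), dtype=bool)
    labels, n = ndimage.label(detections, structure=structure)
    keep = np.zeros_like(detections)
    for lab in range(1, n + 1):
        frames = np.any(labels == lab, axis=(1, 2))   # frames the track touches
        if frames.sum() >= min_length:
            keep |= labels == lab
    return keep

# Toy volume: one persistent drifting track and one two-frame spurious blip
vol = np.zeros((6, 8, 8), dtype=bool)
for t in range(6):
    vol[t, 2 + t // 2, 3] = True        # slowly drifting cell
vol[1, 6, 6] = vol[2, 6, 6] = True      # false positive
pruned = prune_short_tracks(vol, min_length=4)
```

Short-lived components are the typical signature of detection noise, so a temporal-persistence criterion removes many false positives while leaving genuine migrating cells as seeds for lineage curation.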