733 results for Smoothed bootstrap


Abstract:

Climatic relationships were established in two 210Pb-dated pollen sequences from small mires closely surrounded by forest just below the actual forest limits (but about 300 m below the potential climatic forest limits) in the northern Swiss Alps (suboceanic in climate; mainly with Picea) and the central Swiss Alps (subcontinental; mainly Pinus cembra and Larix), at annual or near-annual resolution from AD 1901 to 1996. Effects of vegetational succession were removed by splitting the time series into early and late periods and by linear detrending. Both pollen concentrations detrended by the depth-age model and modified percentages (in which counts of dominant pollen types are down-weighted) are correlated by simple linear regression with smoothed climatic parameters with one- and two-year time lags, including average monthly and April/September daylight air temperatures and seasonal and annual precipitation sums. Results from detrended pollen concentrations suggest that peat accumulation is favoured in the northern-Alpine mire either by early snowmelt or by summer precipitation, but in the central-Alpine mire by increased precipitation and cooler summers, suggesting that the northern-Alpine mire lies near the upper altitudinal limit of peat formation and the central-Alpine mire near the lower limit. Results from modified pollen percentages indicate that pollen production by plants growing near their upper altitudinal limit is limited by insufficient warmth in summer, whereas pollen production by plants growing near their lower altitudinal limit is limited by excessively high temperatures. Only weakly significant pollen/climate relationships were found for Pinus cembra and Larix, probably because they experience little climatic stress growing 300 m below the potential climatic forest limit.
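
For readers unfamiliar with the procedure, the toy sketch below walks through the detrend-then-regress step described above: a (simulated) pollen-concentration series is linearly detrended and regressed on a smoothed, one-year-lagged (simulated) temperature series. All series, the 2-year smoothing window and the variable names are illustrative assumptions, not the study's data or code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1901, 1997)                                           # AD 1901-1996
pollen = rng.lognormal(mean=8.0, sigma=0.3, size=years.size)            # hypothetical pollen concentrations
july_temp = 12 + 0.01 * (years - 1901) + rng.normal(0, 1, years.size)   # hypothetical July temperatures

# 1) Remove the successional trend from the pollen series by linear detrending.
trend = stats.linregress(years, pollen)
pollen_detrended = pollen - (trend.slope * years + trend.intercept)

# 2) Smooth the climate series (2-year running mean) and apply a one-year lag.
temp_smoothed = np.convolve(july_temp, np.ones(2) / 2, mode="valid")    # length N - 1
lag = 1
x = temp_smoothed[:-lag]
y = pollen_detrended[lag + 1:]

# 3) Simple linear regression of detrended pollen on lagged, smoothed temperature.
fit = stats.linregress(x, y)
print(f"r = {fit.rvalue:.2f}, p = {fit.pvalue:.3f}")
```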

Abstract:

Charcoal particles in pollen slides are often abundant, and analysts are therefore faced with the problem of setting the minimum counting sum as small as possible in order to save time. We analysed the reliability of charcoal-concentration estimates based on different counting sums, using simulated low- to high-count samples. Bootstrap simulations indicate that the variability of inferred charcoal concentrations increases progressively with decreasing sums. Below 200 items (i.e., the sum of charcoal particles and exotic marker grains), reconstructed fire incidence is either too high or too low. Statistical comparisons show that the means of bootstrap simulations stabilize after 200 counts. Moreover, a count of 200-300 items is sufficient to produce a charcoal-concentration estimate with less than ±5% error when compared with high-count samples of 1000 items, for charcoal/marker-grain ratios of 0.1-0.91. If, however, this ratio is extremely high or low (>0.91 or <0.1) and such samples are frequent, we suggest reducing or adding marker grains prior to processing new samples.
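
A simplified stand-in for the bootstrap experiment described above: redraw samples of a given counting sum many times (here from a binomial model rather than by subsampling real counts), convert each to a concentration estimate via the marker-grain spike, and watch the spread grow as the counting sum shrinks. The spike size, sediment volume and charcoal/marker ratio are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
MARKERS_ADDED = 10_000      # exotic marker grains spiked into the sample (assumed)
VOLUME_CM3 = 1.0            # sediment volume (assumed)
TRUE_RATIO = 0.5            # probability that a counted item is charcoal (assumed)

def simulate_concentration(count_sum, n_boot=2_000):
    """Resampled charcoal-concentration estimates for a given counting sum."""
    charcoal = rng.binomial(count_sum, TRUE_RATIO, size=n_boot)
    markers = count_sum - charcoal
    # Concentration estimate: charcoal per marker grain, scaled by the spike and volume.
    return charcoal / np.maximum(markers, 1) * MARKERS_ADDED / VOLUME_CM3

for count_sum in (50, 100, 200, 300, 1000):
    conc = simulate_concentration(count_sum)
    cv = conc.std() / conc.mean()
    print(f"sum={count_sum:4d}  mean={conc.mean():8.0f}  CV={cv:.2%}")
```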

Abstract:

Passive positioning systems produce user location information for third-party providers of positioning services. Since the tracked wireless devices do not participate in the positioning process, passive positioning can only rely on simple, measurable radio-signal parameters, such as timing or power information. In this work, we provide a passive tracking system for WiFi signals with an enhanced particle filter using fine-grained power-based ranging. Our proposed particle filter provides an improved likelihood function on the observation parameters and is equipped with a modified coordinated-turn model to address the challenges of a passive positioning system. The anchor nodes for WiFi signal sniffing and target positioning use software-defined radio techniques to extract channel state information and mitigate multipath effects. By combining the enhanced particle filter and a set of enhanced ranging methods, our system can track mobile targets with an accuracy of 1.5 m at the 50th percentile and 2.3 m at the 90th percentile in a complex indoor environment. Our proposed particle filter significantly outperforms the typical bootstrap particle filter, the extended Kalman filter and trilateration algorithms.
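
For context, the sketch below implements the kind of plain bootstrap particle filter the enhanced filter is benchmarked against: constant-velocity prediction, range-likelihood weighting against fixed anchors, and multinomial resampling. Anchor positions, noise levels and the motion model are assumptions for illustration, not the system described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)
ANCHORS = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])  # assumed sniffer positions (m)
DT, Q_NOISE, R_RANGE, N_PART = 0.5, 0.1, 0.5, 2_000

def predict(particles):
    """Constant-velocity motion model with additive Gaussian process noise."""
    particles[:, :2] += particles[:, 2:] * DT
    particles += rng.normal(0.0, Q_NOISE, particles.shape)
    return particles

def update(particles, ranges):
    """Weight particles by the Gaussian likelihood of the measured anchor ranges."""
    d = np.linalg.norm(particles[:, None, :2] - ANCHORS[None], axis=2)    # (N, n_anchors)
    log_w = -0.5 * np.sum(((d - ranges) / R_RANGE) ** 2, axis=1)
    w = np.exp(log_w - log_w.max())
    return w / w.sum()

def resample(particles, w):
    """Multinomial resampling (the 'bootstrap' step)."""
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# One simulated step: a target at (3, 4) measured by the four anchors.
target = np.array([3.0, 4.0])
ranges = np.linalg.norm(ANCHORS - target, axis=1) + rng.normal(0, R_RANGE, len(ANCHORS))
particles = np.hstack([rng.uniform(0, 10, (N_PART, 2)), rng.normal(0, 1, (N_PART, 2))])

particles = predict(particles)
w = update(particles, ranges)
estimate = np.average(particles[:, :2], weights=w, axis=0)
particles = resample(particles, w)
print("position estimate:", np.round(estimate, 2))
```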

Abstract:

OBJECTIVE In patients with a long life expectancy and high-risk (HR) prostate cancer (PCa), the chance of dying from PCa is not negligible and may change significantly with the time elapsed from surgery. The aim of this study was to evaluate long-term survival patterns in young patients treated with radical prostatectomy (RP) for HRPCa. MATERIALS AND METHODS Within a multi-institutional cohort, 600 young patients (≤59 years) treated with RP between 1987 and 2012 for HRPCa (defined as at least one of the following adverse characteristics: prostate-specific antigen >20, cT3 or higher, biopsy Gleason sum 8-10) were identified. Smoothed cumulative incidence plots were used to assess cancer-specific mortality (CSM) and other-cause mortality (OCM) rates at 10, 15, and 20 years after RP. The same analyses were performed to assess the 5-year probability of CSM and OCM in patients who survived 5, 10, and 15 years after RP. A multivariable competing-risks regression model was fitted to identify predictors of CSM and OCM. RESULTS The 10-, 15- and 20-year CSM and OCM rates were 11.6% and 5.5% vs. 15.5% and 13.5% vs. 18.4% and 19.3%, respectively. The 5-year probabilities of CSM and OCM among patients who survived 5, 10, and 15 years after RP were 6.4% and 2.7% vs. 4.6% and 9.6% vs. 4.2% and 8.2%, respectively. Year of surgery, pathological stage and Gleason score, surgical margin status and lymph node invasion were the major determinants of CSM (all P ≤ 0.03). Conversely, none of the covariates was significantly associated with OCM (all P ≥ 0.09). CONCLUSIONS Very long-term cancer control in young high-risk patients after RP is highly satisfactory. PCa is the leading cause of death in these young patients during the first 10 years of survivorship after RP; thereafter, mortality not related to PCa becomes the main cause of death. Consequently, surgery should be considered in young patients with high-risk disease, and strict PCa follow-up should be enforced during the first 10 years of survivorship after RP.
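
A hedged sketch of the nonparametric cumulative-incidence calculation underlying CSM/OCM estimates of this kind (an Aalen-Johansen-type estimator for two competing causes of death). The follow-up times and event codes are simulated toys, not the study cohort.

```python
import numpy as np

def cumulative_incidence(time, event, cause, horizon):
    """Aalen-Johansen-type estimate of P(death from `cause` by `horizon`).
    Event codes: 0 = censored, 1 = cancer-specific death, 2 = other-cause death."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    n_at_risk = len(time)
    surv = 1.0          # overall survival just before the current event time
    cif = 0.0
    for t, e in zip(time, event):
        if t > horizon:
            break
        if e == cause:
            cif += surv * (1.0 / n_at_risk)     # increment by S(t-) * d / n
        if e != 0:
            surv *= 1.0 - 1.0 / n_at_risk       # any death reduces overall survival
        n_at_risk -= 1
    return cif

rng = np.random.default_rng(3)
t_obs = rng.exponential(scale=25.0, size=600)                  # toy follow-up times (years)
ev = rng.choice([0, 1, 2], size=600, p=[0.7, 0.15, 0.15])      # toy censoring / CSM / OCM codes

for horizon in (10, 15, 20):
    csm = cumulative_incidence(t_obs, ev, cause=1, horizon=horizon)
    ocm = cumulative_incidence(t_obs, ev, cause=2, horizon=horizon)
    print(f"{horizon}-year CSM {csm:.1%}, OCM {ocm:.1%}")
```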

Abstract:

OBJECTIVES Improvement of skin fibrosis is part of the natural course of diffuse cutaneous systemic sclerosis (dcSSc). Recognising the patients most likely to improve could help tailor clinical management and enrich cohorts for clinical trials. In this study, we aimed to identify predictors of improvement of skin fibrosis in patients with dcSSc. METHODS We performed a longitudinal analysis of the European Scleroderma Trials And Research (EUSTAR) registry including patients with dcSSc fulfilling American College of Rheumatology criteria, with a baseline modified Rodnan skin score (mRSS) ≥7 and a follow-up mRSS at 12±2 months. The primary outcome was skin improvement (decrease in mRSS of >5 points and ≥25%) at 1-year follow-up. A corresponding increase in mRSS was considered progression. Candidate predictors of skin improvement were selected by expert opinion, and logistic regression with bootstrap validation was applied. RESULTS Of the 919 patients included, 218 (24%) improved and 95 (10%) progressed. Eleven candidate predictors of skin improvement were analysed. The final model identified high baseline mRSS and absence of tendon friction rubs as independent predictors of skin improvement. Baseline mRSS was the strongest predictor of skin improvement, independent of disease duration. An upper threshold between 18 and 25 performed best in enriching for progressors over regressors. CONCLUSIONS Patients with advanced skin fibrosis at baseline and absence of tendon friction rubs are more likely to regress in the next year than patients with milder skin fibrosis. These evidence-based data can be implemented in clinical trial design to minimise the inclusion of patients who would regress under standard of care.
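
The sketch below illustrates one common form of bootstrap validation for a logistic regression model (Harrell-style optimism correction of the apparent AUC), using the two predictors retained in the final model above. The simulated data, effect sizes and the choice of AUC as the performance measure are assumptions for illustration, not the EUSTAR analysis itself.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 919
mrss = rng.integers(7, 40, size=n)                      # baseline mRSS >= 7 (simulated)
friction_rubs = rng.binomial(1, 0.3, size=n)            # tendon friction rubs present? (simulated)
logit = -3.0 + 0.08 * mrss - 0.7 * friction_rubs        # assumed true effects
improved = rng.binomial(1, 1 / (1 + np.exp(-logit)))
X = np.column_stack([mrss, friction_rubs])

model = LogisticRegression().fit(X, improved)
apparent_auc = roc_auc_score(improved, model.predict_proba(X)[:, 1])

# Optimism correction: refit on bootstrap samples, compare each refit's AUC on its
# own bootstrap sample with its AUC on the original data.
optimism = []
for _ in range(200):
    idx = rng.integers(0, n, size=n)
    m_b = LogisticRegression().fit(X[idx], improved[idx])
    auc_boot = roc_auc_score(improved[idx], m_b.predict_proba(X[idx])[:, 1])
    auc_orig = roc_auc_score(improved, m_b.predict_proba(X)[:, 1])
    optimism.append(auc_boot - auc_orig)

print(f"apparent AUC {apparent_auc:.3f}, optimism-corrected {apparent_auc - np.mean(optimism):.3f}")
```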

Abstract:

Many attempts have already been made to detect exomoons around transiting exoplanets, but the first confirmed discovery is still pending. The experience gathered so far allows us to better optimize future space telescopes for this challenge already during the development phase. In this paper we focus on the forthcoming CHaracterising ExOPlanet Satellite (CHEOPS), describing an optimized decision algorithm with step-by-step evaluation and calculating the number of transits required for an exomoon detection for various planet-moon configurations observable by CHEOPS. We explore the most efficient way to carry out such observations so as to minimize the cost in observing time. Our study is based on PTV (photocentric transit timing variation) observations in simulated CHEOPS data, but the recipe does not depend on the actual detection method and can be substituted with, e.g., the photodynamical method for later applications. Using current state-of-the-art simulations of CHEOPS data, we analyzed transit observation sets for different star-planet-moon configurations and performed a bootstrap analysis to determine their detection statistics. We found that the detection limit is around an Earth-sized moon. In the case of favorable spatial configurations, systems with at least a large moon and a Neptune-sized planet, an 80% detection chance requires at least 5-6 transit observations on average. There is also a nonzero chance for smaller moons, but the detection statistics deteriorate rapidly while the number of necessary transit measurements increases quickly. After the CoRoT and Kepler spacecraft, CHEOPS will be the next dedicated space telescope to observe exoplanetary transits and characterize systems with known Doppler planets. Although it has a smaller aperture than Kepler (the ratio of the mirror diameters is about 1/3) and carries a CCD similar to Kepler's, it will observe brighter stars and operate at a higher sampling rate; therefore, the detection limit for an exomoon can be the same or even better, which will make CHEOPS a competitive instrument in the quest for exomoons.
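
As a rough illustration of how detection statistics can be tabulated against the number of observed transits, the sketch below bootstraps a simulated PTV series and flags a detection when the lower confidence bound of its RMS exceeds the no-moon timing scatter. The signal amplitude, noise level and threshold are invented numbers, not CHEOPS simulation results, and the test is a generic stand-in for the paper's decision algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)
NOISE_S, SIGNAL_S = 20.0, 35.0      # assumed timing noise and PTV amplitude, in seconds

def detect(ptv, n_boot=1_000, alpha=0.05):
    """Bootstrap the RMS of a PTV series; call it a detection if the lower
    confidence bound exceeds the timing scatter expected without a moon."""
    samples = rng.choice(ptv, size=(n_boot, ptv.size), replace=True)
    rms_boot = np.sqrt(np.mean(samples ** 2, axis=1))
    return np.quantile(rms_boot, alpha) > NOISE_S

def detection_chance(n_transits, n_trials=500):
    hits = 0
    for _ in range(n_trials):
        phase = 2 * np.pi * rng.random(n_transits)
        ptv = SIGNAL_S * np.sin(phase) + rng.normal(0.0, NOISE_S, n_transits)
        hits += detect(ptv)
    return hits / n_trials

for n in (3, 6, 10):
    print(f"{n} transits: detection chance ~{detection_chance(n):.0%}")
```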

Abstract:

Arterial spin labeling (ASL) is a technique for noninvasively measuring cerebral perfusion using magnetic resonance imaging. Clinical applications of ASL include functional activation studies, evaluation of the effect of pharmaceuticals on perfusion, and assessment of cerebrovascular disease, stroke, and brain tumors. The use of ASL in the clinic has been limited by poor image quality when large anatomic coverage is required and by the time required for data acquisition and processing. This research sought to address these difficulties by optimizing the ASL acquisition and processing schemes. To improve data acquisition, optimal acquisition parameters were determined through simulations, phantom studies and in vivo measurements. The scan time for ASL data acquisition was limited to fifteen minutes to reduce potential subject motion. A processing scheme was implemented that rapidly produced regional cerebral blood flow (rCBF) maps with minimal user input. To provide a measure of the precision of the rCBF values produced by ASL, bootstrap analysis was performed on a representative data set. The bootstrap analysis of single gray- and white-matter voxels yielded coefficients of variation of 6.7% and 29%, respectively, implying that the calculated rCBF value is far more precise for gray matter than for white matter. Additionally, bootstrap analysis was performed to investigate the sensitivity of the rCBF data to the input parameters and to provide a quantitative comparison of several existing perfusion models. This study guided the selection of the optimum perfusion quantification model for further experiments. The optimized ASL acquisition and processing schemes were evaluated with two ASL acquisitions on each of five normal subjects. The gray-to-white-matter rCBF ratios for nine of the ten acquisitions were within ±10% of 2.6, and none was statistically different from 2.6, the typical ratio produced by a variety of quantitative perfusion techniques. Overall, this work produced an ASL data acquisition and processing technique for quantitative perfusion and functional activation studies, while revealing the limitations of the technique through bootstrap analysis.
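
A minimal sketch of the kind of voxel-level bootstrap used to quantify rCBF precision: resample a voxel's control-label difference measurements, recompute the perfusion estimate each time, and report the coefficient of variation. The difference values and the single scaling constant standing in for the perfusion model are assumptions for illustration, not the dissertation's data or quantification model.

```python
import numpy as np

rng = np.random.default_rng(6)
SCALE = 6000.0   # assumed conversion from mean control-label difference to ml/100g/min

def bootstrap_cv(diffs, n_boot=5_000):
    """Coefficient of variation of the bootstrapped rCBF estimate for one voxel."""
    samples = rng.choice(diffs, size=(n_boot, diffs.size), replace=True)
    rcbf = SCALE * samples.mean(axis=1)
    return rcbf.std() / rcbf.mean()

gray = rng.normal(0.010, 0.004, size=60)    # toy control-label differences, gray matter
white = rng.normal(0.004, 0.004, size=60)   # toy differences, white matter (lower SNR)

print(f"gray-matter CV  ~{bootstrap_cv(gray):.1%}")
print(f"white-matter CV ~{bootstrap_cv(white):.1%}")
```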

Abstract:

This study examines the relationship between the stock market reaction to horizontal merger announcements and the technical efficiency levels of the participating firms. The analysis is based on data for eighty mergers between firms in the U.S. manufacturing industry during the 1990s. We employ Data Envelopment Analysis (DEA) to measure technical efficiency, which captures a firm's competence in producing the maximum output from given productive resources. Abnormal returns around the merger announcements reflect investors' re-evaluation of the future performance of the participating firms. To avoid problems of non-normality and heteroskedasticity in the regression analysis, a bootstrap method is employed for estimation and inference. We found a significant relationship between technical efficiency and market response. The market apparently welcomes the merger as an arrangement to improve resource utilization.
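
The sketch below shows a pairs (case-resampling) bootstrap for a regression of abnormal returns on DEA efficiency scores, which avoids relying on normality or homoskedasticity of the errors. The 80 simulated efficiency scores and returns are stand-ins for the study's data, and the simple OLS slope is only a placeholder for whichever specification the authors used.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 80
efficiency = rng.uniform(0.5, 1.0, size=n)                   # toy DEA scores in (0, 1]
abnormal_return = 0.05 * efficiency + rng.normal(0, 0.03, size=n) * efficiency  # heteroskedastic noise

def ols_slope(x, y):
    """Slope of an ordinary least-squares fit with intercept."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

slope = ols_slope(efficiency, abnormal_return)
# Pairs bootstrap: resample (efficiency, return) pairs with replacement and refit.
boot = np.array([ols_slope(efficiency[idx], abnormal_return[idx])
                 for idx in rng.integers(0, n, size=(5_000, n))])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"slope {slope:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```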

Abstract:

In recent years, disaster preparedness through the assessment of medical and special needs persons (MSNP) has taken center stage in the public eye, as a result of frequent natural disasters such as hurricanes, storm surges and tsunamis driven by climate change and increased human activity on our planet. Statistical methods for complex survey design and analysis have gained significance as a consequence. However, many challenges remain in extending such assessments to the target population for policy-level advocacy and implementation. Objective. This study discusses the use of some of these statistical methods for disaster preparedness and medical needs assessment, to support local and state governments in policy-level decision making and logistic support and to avoid loss of life and property in future calamities. Methods. In order to obtain precise and unbiased estimates of medical special needs persons (MSNP) and of disaster preparedness for evacuation in the Rio Grande Valley (RGV) of Texas, a stratified, cluster-randomized, multi-stage sampling design was implemented. The US School of Public Health, Brownsville surveyed 3088 households in three counties, namely Cameron, Hidalgo, and Willacy. Multiple statistical methods were applied, and estimates were obtained taking into account the probability of selection and clustering effects. The statistical methods for data analysis discussed were Multivariate Linear Regression (MLR), Survey Linear Regression (Svy-Reg), Generalized Estimating Equations (GEE) and Multilevel Mixed Models (MLM), all with and without sampling weights. Results. The estimated population of the RGV was 1,146,796; 51.5% were female, 90% Hispanic, 73% married, 56% unemployed and 37% had their own transport. 40% of residents had at most an elementary-school education, another 42% reached high school and only 18% went to college. Median household income was less than $15,000/year. The MSNP population was estimated at 44,196 (3.98%) [95% CI: 39,029; 51,123]. All statistical models are in concordance, with MSNP estimates ranging from 44,000 to 48,000. The MSNP estimates by statistical method were: MLR (47,707; 95% CI: 42,462; 52,999), MLR with weights (45,882; 95% CI: 39,792; 51,972), bootstrap regression (47,730; 95% CI: 41,629; 53,785), GEE (47,649; 95% CI: 41,629; 53,670), GEE with weights (45,076; 95% CI: 39,029; 51,123), Svy-Reg (44,196; 95% CI: 40,004; 48,390) and MLM (46,513; 95% CI: 39,869; 53,157). Conclusion. The RGV is a flood zone, highly susceptible to hurricanes and other natural disasters. People in the region are mostly Hispanic and under-educated, with among the lowest income levels in the U.S. In case of a disaster, people at large would be incapacitated, with only 37% having their own transport to take care of MSNP. Intervention by local and state governments in planning, preparation and evacuation support is necessary in any such disaster to avoid the loss of precious human life. Key words: complex surveys, statistical methods, multilevel models, cluster randomized, sampling weights, raking, survey regression, generalized estimating equations (GEE), random effects, intracluster correlation coefficient (ICC).
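
As an illustration of design-aware uncertainty estimation in this setting, the sketch below computes a weighted MSNP household total and a cluster-bootstrap confidence interval that resamples whole clusters, respecting the multi-stage design. Cluster sizes, sampling weights and the MSNP indicator are simulated, not the RGV survey data, and the sketch is not one of the models listed above.

```python
import numpy as np

rng = np.random.default_rng(8)
n_clusters, households_per_cluster = 100, 31
cluster = np.repeat(np.arange(n_clusters), households_per_cluster)
weight = rng.uniform(200, 600, size=cluster.size)   # households represented by each respondent (toy)
msnp = rng.binomial(1, 0.04, size=cluster.size)     # 1 = household reports an MSNP (toy)

# Per-cluster weighted totals; the point estimate is their sum.
totals = np.array([np.sum(weight[cluster == c] * msnp[cluster == c]) for c in range(n_clusters)])
point = totals.sum()

# Cluster bootstrap: resample whole clusters with replacement and re-sum their totals.
boot = totals[rng.integers(0, n_clusters, size=(2_000, n_clusters))].sum(axis=1)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"estimated MSNP households: {point:,.0f}  (95% cluster-bootstrap CI {lo:,.0f} - {hi:,.0f})")
```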

Abstract:

Standardization is a common method for adjusting for confounding factors when comparing two or more exposure categories to assess excess risk. An arbitrary choice of standard population in standardization introduces selection bias due to the healthy worker effect. Small samples in specific groups also pose problems in estimating relative risk and assessing its statistical significance. As an alternative, statistical models have been proposed to overcome such limitations and obtain adjusted rates. In this dissertation, a multiplicative model is considered to address the issues related to standardized indices, namely the Standardized Mortality Ratio (SMR) and the Comparative Mortality Factor (CMF). The model provides an alternative to conventional standardization techniques. Maximum likelihood estimates of the model parameters are used to construct an index, similar to the SMR, for estimating the relative risk of the exposure groups under comparison. A parametric bootstrap resampling method is used to evaluate the goodness of fit of the model, the behavior of the estimated parameters and the variability in relative risk on generated samples. The model provides an alternative to both direct and indirect standardization methods.
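
A minimal sketch of a parametric bootstrap for a standardized mortality ratio (SMR = observed / expected deaths): redraw stratum death counts from a Poisson model fitted at the estimated relative risk and summarize the variability of the recomputed index. The stratum exposures, reference rates and observed counts are illustrative, not the dissertation's data or its multiplicative model.

```python
import numpy as np

rng = np.random.default_rng(9)
person_years = np.array([12_000.0, 8_000.0, 5_000.0])    # exposure by age stratum (toy)
reference_rate = np.array([0.001, 0.004, 0.012])         # deaths per person-year in the standard population (toy)
observed_deaths = np.array([15, 38, 70])                 # deaths observed in the exposed cohort (toy)

expected = person_years * reference_rate
smr = observed_deaths.sum() / expected.sum()

# Parametric bootstrap: redraw stratum death counts from Poisson(mu = SMR * expected)
# and recompute the index for each replicate.
boot_counts = rng.poisson(smr * expected, size=(10_000, expected.size))
smr_boot = boot_counts.sum(axis=1) / expected.sum()
lo, hi = np.percentile(smr_boot, [2.5, 97.5])
print(f"SMR = {smr:.2f}, parametric-bootstrap 95% CI [{lo:.2f}, {hi:.2f}]")
```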

Abstract:

The hierarchical linear growth model (HLGM), a flexible and powerful analytic method, has played an increasingly important role in psychology, public health and the medical sciences in recent decades. Researchers who conduct HLGM are mostly interested in the treatment effect on individual trajectories, which is indicated by the cross-level interaction effects. However, the statistical hypothesis test for the cross-level interaction effect in HLGM only shows whether there is a significant group difference in the average rate of change, rate of acceleration or a higher polynomial effect; it fails to convey information about the magnitude of the difference between the group trajectories at specific time points. Thus, reporting and interpreting effect sizes has received increasing emphasis in HLGM in recent years, owing to the limitations of, and growing criticism of, statistical hypothesis testing. However, most researchers fail to report these model-implied effect sizes for comparing group trajectories, and their corresponding confidence intervals, in HLGM analyses, because appropriate standard functions to estimate effect sizes associated with the model-implied difference between group trajectories are lacking, as are routines in popular statistical software to calculate them automatically. The present project is the first to establish appropriate computing functions to assess the standardized difference between group trajectories in HLGM. We proposed two functions to estimate effect sizes for the model-based difference between group trajectories at specific times, and we also suggested robust effect sizes to reduce the bias of the estimated effect sizes. We then applied the proposed functions to estimate the population effect sizes (d) and robust effect sizes (du) for the cross-level interaction in HLGM using three simulated datasets, and we compared three methods of constructing confidence intervals around d and du, recommending the best one for application. Finally, we constructed 95% confidence intervals, using the most suitable method, for the effect sizes obtained from the three simulated datasets. The effect sizes between group trajectories for the three simulated longitudinal datasets indicated that even when the statistical hypothesis test shows no significant difference between group trajectories, the effect sizes between these trajectories can still be large at some time points. Therefore, effect sizes between group trajectories in HLGM analyses provide additional and meaningful information for assessing the group effect on individual trajectories. In addition, we compared three methods of constructing 95% confidence intervals around the corresponding effect sizes, which address the uncertainty of the effect sizes as estimates of the population parameters. We suggest the noncentral t-distribution-based method when its assumptions hold, and the bootstrap bias-corrected and accelerated (BCa) method when the assumptions are not met.
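
The sketch below shows one way to compute a bias-corrected and accelerated (BCa) bootstrap confidence interval for a standardized difference between two groups, standing in for the model-implied trajectory effect sizes discussed above. The simulated change scores, group sizes and the use of Cohen's d are assumptions; the real quantity would come from the fitted HLGM at a chosen time point.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(10)

def cohens_d(x, y):
    """Standardized mean difference with a pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2))
    return (x.mean() - y.mean()) / pooled_sd

def bca_interval(x, y, n_boot=5_000, alpha=0.05):
    """BCa bootstrap confidence interval for cohens_d(x, y)."""
    theta_hat = cohens_d(x, y)
    boot = np.array([cohens_d(x[rng.integers(0, len(x), len(x))],
                              y[rng.integers(0, len(y), len(y))])
                     for _ in range(n_boot)])
    # Bias correction from the proportion of bootstrap replicates below the estimate.
    z0 = norm.ppf(np.mean(boot < theta_hat))
    # Acceleration from a jackknife (delete one observation at a time, either group).
    jack = np.array([cohens_d(np.delete(x, i), y) for i in range(len(x))] +
                    [cohens_d(x, np.delete(y, j)) for j in range(len(y))])
    diffs = jack.mean() - jack
    a = np.sum(diffs ** 3) / (6.0 * np.sum(diffs ** 2) ** 1.5)
    # Adjusted percentiles of the bootstrap distribution.
    z = norm.ppf([alpha / 2, 1 - alpha / 2])
    adj = norm.cdf(z0 + (z0 + z) / (1 - a * (z0 + z)))
    return theta_hat, np.quantile(boot, adj)

treatment = rng.normal(1.2, 1.0, size=60)   # toy change scores, treatment group
control = rng.normal(0.8, 1.0, size=60)     # toy change scores, control group
d, (lo, hi) = bca_interval(treatment, control)
print(f"d = {d:.2f}, 95% BCa CI [{lo:.2f}, {hi:.2f}]")
```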

Abstract:

The sedimentary archive from Laguna Potrok Aike is the only continuous record reaching back to the last Glacial period in continental southeastern Patagonia. Located in the path of the Southern Hemisphere westerly winds and in the source region of dust deposited in Antarctica during Glacial periods, southern Patagonia is a vantage point from which to reconstruct past changes in aeolian activity. Here we use high-resolution rock-magnetic and physical grain-size data from site 2 of the International Continental scientific Drilling Program (ICDP) Potrok Aike maar lake Sediment Archive Drilling prOject (PASADO) to develop magnetic proxies of dust and wind intensity at 52°S since 51,200 cal BP. Rock-magnetic analyses indicate that the magnetic mineral assemblage is dominated by detrital magnetite. Based on the estimated flux of magnetite to the lake and on comparison with distal dust records from the Southern Ocean and Antarctica, kLF (low-field magnetic susceptibility) is interpreted as an indicator of dust in the southern Patagonian dust source at the millennial time scale, when the influence of ferrimagnetic grain size and coercivity is minimal. Comparison with physical grain-size data indicates that the median destructive field of isothermal remanent magnetisation (MDFIRM) mostly reflects medium to coarse magnetite-bearing silts typically transported by wind in short-term suspension. Comparison with wind-intensity proxies from the Southern Hemisphere during the last Glacial period and with regional records from Patagonia since the last deglaciation, including marine, lacustrine and peat-bog sediments as well as speleothems, reveals variability similar to that of MDFIRM up to the centennial time scale. MDFIRM is therefore interpreted as a wind-intensity proxy for southeastern Patagonia that is independent of moisture changes, with stronger winds capable of transporting coarser magnetite-bearing silts to the lake.

Abstract:

In this study we present combined high-resolution records of sea surface temperature (SST), phytoplankton productivity, and nutrient cycling in the Benguela Upwelling System (BUS) for the past 3.5 Ma. The SST record provided evidence that upwelling activity off Namibia intensified mainly ca. 2.4-2.0 Ma ago, in response to the cooling of the Southern Ocean and the resultant strengthening of the trade winds. As revealed by productivity-related proxies, BUS intensification led to a major transition in regional biological productivity, reflected in the termination of the Matuyama Diatom Maximum (a diatom high-production event). This major oceanic reorganization in the Benguela was accompanied by changes in nutrient sources, as indicated by a new nitrogen isotope (δ15N) record that reveals stepwise increases at ca. 2.4 and ca. 1.5 Ma ago. The change in source region likely resulted from significant changes in intermediate-water formation tied to the reorganization of oceanic conditions in the Southern Ocean, which may in turn have largely controlled the global ocean N cycle, and therefore the N isotopic composition of nutrients, since 3.5 Ma ago.

Abstract:

Eight deep-sea sediment cores from the North Atlantic Ocean ranging from 31° to 72°N are studied to reconstruct the meridional gradients in surface hydrographic conditions during the interval of minimum ice volume within the last interglacial period. Using benthic foraminiferal δ18O measurements and estimates of Sea Surface Temperature (SST) and Sea Surface Salinity (SSS), we show that summer SSTs and SSSs decreased gradually during the interval of minimum ice volume at high-latitude sites (52°-72°N), whereas they were stable or increased during the same time period at low-latitude sites (31°-41°N). This increase in meridional gradients of SSTs and SSSs may have been due to changes in the latitudinal distribution of summer and annual-average insolation and associated oceanic and atmospheric feedbacks. These trends documented for the Eemian ice-volume minimum period are similar to corresponding changes observed during the Holocene and may have had a similar origin.

Abstract:

One of the most abrupt and yet unexplained past rises in atmospheric CO2 (10 p.p.m.v. in two centuries) occurred in quasi-synchrony with abrupt northern-hemispheric warming into the Bølling/Allerød, 14,600 years ago. Here we use a U/Th-dated record of atmospheric Δ14C from Tahiti corals to provide an independent and precise age control for this CO2 rise. We also use model simulations to show that the release of old (nearly 14C-free) carbon can explain these changes in CO2 and Δ14C. The Δ14C record provides an independent constraint on the amount of carbon released (125 Pg C). We suggest, in line with observations of atmospheric CH4 and terrigenous biomarkers, that thawing permafrost in high northern latitudes could have been the source of carbon, possibly with a contribution from flooding of the Siberian continental shelf during meltwater pulse 1A. Our findings highlight the potential of the permafrost carbon reservoir to modulate abrupt climate changes via greenhouse-gas feedbacks.