873 results for strategies analysis
Abstract:
An in-depth study, using simulations and covariance analysis, is performed to identify the optimal sequence of observations to obtain the most accurate orbit propagation. The accuracy of the results of an orbit determination/improvement process depends on: tracklet length, number of observations, type of orbit, astrometric error, time interval between tracklets, and observation geometry. The latter depends on the position of the object along its orbit and the location of the observing station. This covariance analysis aims to optimize the observation strategy taking into account the influence of the orbit shape, the relative object-observer geometry, and the interval between observations.
Abstract:
The Astronomical Institute of the University of Bern (AIUB) is conducting several search campaigns for space debris using optical sensors. The debris objects are discovered during systematic survey observations. In general, the result of a discovery consists of only a short observation arc, or tracklet, which is used to perform a first orbit determination in order to be able to observe the object again in subsequent follow-up observations. The additional observations are used in the orbit improvement process to obtain accurate orbits to be included in a catalogue. In order to obtain the most accurate orbit within the time available it is necessary to optimize the follow-up observation strategy. In this paper an in-depth study, using simulations and covariance analysis, is performed to identify the optimal sequence of follow-up observations to obtain the most accurate orbit propagation to be used for space debris catalogue maintenance. The main factors that determine the accuracy of the results of an orbit determination/improvement process are: tracklet length, number of observations, type of orbit, astrometric error of the measurements, time interval between tracklets, and the relative position of the object along its orbit with respect to the observing station. The main aim of the covariance analysis is to optimize the follow-up strategy as a function of the object-observer geometry, the interval between follow-up observations, and the shape of the orbit. This analysis can be applied to every orbital regime, but particular attention was dedicated to geostationary, Molniya, and geostationary transfer orbits. Finally, the case with more than two follow-up observations and the influence of a second observing station are also analyzed.
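The covariance analysis described above rests on propagating a state covariance through time. A minimal sketch of linear covariance propagation, using a hypothetical 2-state model (position and velocity along one axis) with illustrative numbers, not the paper's actual dynamics:

```python
import numpy as np

# Linear covariance propagation: the state covariance P maps through
# a state-transition matrix Phi as P' = Phi @ P @ Phi.T.
# All values below are illustrative, not from the paper.
dt = 60.0                       # time step between epochs [s]
Phi = np.array([[1.0, dt],      # position inherits velocity error over dt
                [0.0, 1.0]])    # velocity unchanged in this toy model
P0 = np.diag([100.0, 0.01])     # initial variances: 100 m^2, 0.01 (m/s)^2
P1 = Phi @ P0 @ Phi.T
# Position variance grows with the interval between observations:
print(P1[0, 0])  # 100 + dt**2 * 0.01 = 136.0
```

The growth of `P1[0, 0]` with `dt` is the basic reason the interval between tracklets drives the achievable propagation accuracy.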
Abstract:
The United States health care system faces significant challenges, particularly with problems of the uninsured and with the rising costs of care. These problems lead many to study and discuss strategies for reforming the health care system. Four different plans for ideal health care reform, set forth by notable scholars or organizations, are explained herein. Then, states within the United States are examined in terms of their recent efforts at health care reform. Those states proposing significant changes to their health care systems are analyzed: Maine, Massachusetts, and Vermont. The strategies used in these three states are compared to the strategies laid out by the experts in order to determine which strategies are the most popular in current health care reform efforts among the states studied here. These strategies are totaled to find which organization's plan for ideal reform seems to be the most popular. The strategies of managed competition are shown to be the most popular strategies among these three state health care reforms, while the strategies of the single-payer plan discussed herein were the least popular. All three states seem to utilize strategies that build upon their previous health care system, rather than implementing strategies that completely replace the previous system.
Abstract:
A strategy of pre-hospital reduced dose fibrinolytic administration coupled with urgent coronary intervention (PCI) for patients with STEMI (FAST-PCI) has been found to be superior to primary PCI (PPCI) alone. A coordinated STEMI system-of-care that includes FAST-PCI might offer better outcomes than pre-hospital diagnosis and STEMI team activation followed by PPCI alone. We compared the in-hospital outcomes for patients treated with the FAST-PCI approach with outcomes for patients treated with the PPCI approach during a pause in the FAST-PCI protocol. In-hospital data for 253 STEMI patients (03/2003–12/2009), treated with the FAST-PCI protocol, were compared to 124 patients (12/2009–08/2011), treated with the PPCI strategy alone. In-hospital mortality was the primary endpoint. Stroke, major bleeding, and reinfarction during index hospitalization were secondary endpoints. Comparing the strategies used during the two time intervals, in-hospital mortality was significantly lower with FAST-PCI than with PPCI (2.77% vs. 10.48%, p = 0.0017). Rates of stroke, reinfarction and major bleeding were similar between the two groups. There was a lower frequency of pre-PCI TIMI 0 flow (no patency) seen in patients treated with FAST-PCI compared to the PPCI patients (26.7% vs. 62.7%, p < 0.0001). Earlier infarct related artery patency in the FAST-PCI group had a favorable impact on the incidence of cardiogenic shock at hospital admission (FAST-PCI 3.1% vs. PPCI 20.9%, p < 0.0001). The FAST-PCI strategy was associated with earlier infarct related artery patency and a lower incidence of cardiogenic shock on hospital arrival, as well as with reduced in-hospital mortality among STEMI patients.
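The reported mortality difference can be re-checked with a standard two-proportion z-test. The event counts below (7/253 and 13/124) are inferred from the quoted percentages, so treat this as an illustrative re-computation rather than the study's own analysis:

```python
from math import sqrt, erfc

# Two-proportion z-test with a pooled standard error; counts are
# inferred from the abstract's percentages (7/253 ~ 2.77%,
# 13/124 ~ 10.48%), so this is only an illustrative re-check.
def two_prop_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                   # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    z = (p1 - p2) / se
    return z, erfc(abs(z) / sqrt(2))            # two-sided p-value

z, p = two_prop_z(7, 253, 13, 124)
print(round(p, 4))  # 0.0017, matching the reported significance level
```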
Abstract:
In this paper, a fully automatic goal-oriented hp-adaptive finite element strategy for open region electromagnetic problems (radiation and scattering) is presented. The methodology leads to exponential rates of convergence in terms of an upper bound of a user-prescribed quantity of interest. Thus, the adaptivity may be guided to provide an optimal error, not globally for the field in the whole finite element domain, but for specific parameters of engineering interest. For instance, the error in the numerical computation of the S-parameters of an antenna array, the field radiated by an antenna, or the Radar Cross Section in given directions, can be minimized. The efficiency of the approach is illustrated with several numerical simulations with two dimensional problem domains. Results include the comparison with the previously developed energy-norm based hp-adaptivity.
Abstract:
Nitrate leaching (NL) is an important N loss process in irrigated agriculture that imposes a cost on the farmer and the environment. A meta-analysis of published experimental results from agricultural irrigated systems was conducted to identify those strategies that have proven effective at reducing NL and to quantify the scale of reduction that can be achieved. Forty-four scientific articles were identified which investigated four main strategies (water and fertilizer management, use of cover crops and fertilizer technology) creating a database with 279 observations on NL and 166 on crop yield. Management practices that adjust water application to crop needs reduced NL by a mean of 80% without a reduction in crop yield. Improved fertilizer management reduced NL by 40%, and the best relationship between yield and NL was obtained when applying the recommended fertilizer rate. Replacing a fallow with a non-legume cover crop reduced NL by 50% while using a legume did not have any effect on NL. Improved fertilizer technology also decreased NL but was the least effective of the selected strategies. The risk of nitrate leaching from irrigated systems is high, but optimum management practices may mitigate this risk and maintain crop yields while enhancing environmental sustainability.
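Meta-analyses of this kind commonly express treatment effects as a log response ratio, lnRR = ln(treatment mean / control mean), and report the back-transformed percent change. A minimal sketch with illustrative numbers (the abstract does not state which effect-size metric was used, so this is an assumption about the method):

```python
from math import log, exp

# Log response ratio as an effect size for a meta-analysis, and its
# back-transformation to a percent change. Numbers are illustrative.
def ln_rr(treatment_mean, control_mean):
    return log(treatment_mean / control_mean)

def percent_change(lnrr):
    return (exp(lnrr) - 1.0) * 100.0

# An 80% reduction in leaching corresponds to lnRR = ln(0.2):
lnrr = ln_rr(20.0, 100.0)           # e.g. 20 vs. 100 kg N/ha leached
print(round(percent_change(lnrr)))  # -80
```

Working on the log scale keeps ratios symmetric (a halving and a doubling have equal magnitude), which is why it is the usual choice for pooling percent reductions across studies.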
Abstract:
A novel time-stepping shift-invert algorithm for linear stability analysis of laminar flows in complex geometries is presented. This method, based on a Krylov subspace iteration, enables the solution of complex non-symmetric eigenvalue problems in a matrix-free framework. Validations and comparisons to the classical exponential method have been performed in three different cases: (i) stenotic flow, (ii) backward-facing step and (iii) lid-driven swirling flow. Results show that this new approach speeds up the required Krylov subspace iterations and has the capability of converging to specific parts of the global spectrum. It is shown that, although the exponential method remains the method of choice if leading eigenvalues are sought, the performance of the present method could be dramatically improved with the use of a preconditioner. In addition, as opposed to other methods, this strategy can be directly applied to any time-stepper, regardless of the temporal or spatial discretization of the latter.
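The core idea of shift-invert in a Krylov framework is that applying (A − σI)⁻¹ maps the eigenvalues of A closest to the shift σ onto the largest Ritz values θ, recovered as μ = σ + 1/θ. A minimal sketch on a small dense matrix, where a direct solve stands in for the matrix-free time-stepper used in the paper (the toy spectrum and shift are my own choices):

```python
import numpy as np

# Shift-invert Arnoldi: the operator is v -> (A - sigma*I)^{-1} v,
# so the largest Ritz values theta correspond to eigenvalues of A
# nearest the shift, via mu = sigma + 1/theta.
def shift_invert_arnoldi(apply_op, n, sigma, k, rng):
    Q = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    q = rng.standard_normal(n)
    Q[:, 0] = q / np.linalg.norm(q)
    m = k
    for j in range(k):
        w = apply_op(Q[:, j])
        for i in range(j + 1):          # modified Gram-Schmidt step
            H[i, j] = Q[:, i] @ w
            w = w - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:         # Krylov space exhausted
            m = j + 1
            break
        Q[:, j + 1] = w / H[j + 1, j]
    theta = np.linalg.eigvals(H[:m, :m])
    theta = theta[np.argsort(-np.abs(theta))]
    return sigma + 1.0 / theta          # eigenvalues of A near sigma

A = np.diag([1.0, 2.0, 3.0, 10.0])      # toy operator with known spectrum
sigma = 2.8
op = lambda v: np.linalg.solve(A - sigma * np.eye(4), v)
mu = shift_invert_arnoldi(op, 4, sigma, 4, np.random.default_rng(0))
print(mu[0].real)  # 3.0, the eigenvalue of A closest to the shift
```

In the matrix-free setting of the paper, the solve would be replaced by an iterative inner solver driven by the time-stepper, which is where the preconditioner mentioned above would pay off.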
Abstract:
A wide variety of socially motivated organisations perform exceptional feats in alleviating societal ills using creativity and entrepreneurial spirit in their quest to scale their social impact. Their focus is seldom limited to one specific region – instead they strive to benefit the largest possible number of people, thereby often transcending national borders. After all, societal issues usually don't stop at a country's borders – why then should good ideas and impactful concepts aimed at solving these issues? For many socially motivated organisations, the goal of disseminating their social impact remains an unfulfilled wish due to a lack of experience and know-how in planning and conducting systematic scaling.
Abstract:
Transportation Department, Office of University Research, Washington, D.C.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
The use of quantitative methods has become increasingly important in the study of neurodegenerative disease. Disorders such as Alzheimer's disease (AD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This article reviews the advantages and limitations of the different methods of quantifying the abundance of pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semiquantitative scores. The major sampling methods by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are also described. In addition, the data analysis methods commonly used to analyse quantitative data in neuropathology, including analyses of variance (ANOVA) and principal components analysis (PCA), are discussed. These methods are illustrated with reference to particular problems in the pathological diagnosis of AD and dementia with Lewy bodies (DLB).
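Lesion abundance measures such as density (count per unit area) are typically compared across diagnostic groups with a one-way ANOVA, as the abstract notes. A pure-Python sketch of the F statistic with hypothetical densities (all numbers invented for illustration):

```python
# One-way ANOVA F statistic for comparing mean lesion densities
# across groups. All densities (counts per mm^2) are hypothetical.
def one_way_anova_f(groups):
    k = len(groups)                         # number of groups
    n = sum(len(g) for g in groups)         # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2
                     for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

ad   = [12.1, 14.3, 13.8, 15.0]   # hypothetical plaque densities, AD
dlb  = [8.2, 7.9, 9.1, 8.5]       # hypothetical densities, DLB
ctrl = [2.0, 1.5, 2.4, 1.9]       # hypothetical controls
F = one_way_anova_f([ad, dlb, ctrl])
print(F > 1.0)  # a large F means between-group variation dominates
```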
Abstract:
Developers of interactive software are confronted by an increasing variety of software tools to help engineer the interactive aspects of software applications. Not only do these tools fall into different categories in terms of functionality, but within each category there is a growing number of competing tools with similar, although not identical, features. Choice of user interface development tool (UIDT) is therefore becoming increasingly complex.
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May, 2014