888 results for PUB CLOSING TIMES


Relevance:

20.00%

Abstract:

The spatial context is critical when assessing present-day climate anomalies, attributing them to potential forcings and making statements about their frequency and severity in a long-term perspective. Recent international initiatives have expanded the number of high-quality proxy records and developed new statistical reconstruction methods. These advances allow more rigorous regional past-temperature reconstructions and, in turn, the possibility of evaluating climate models on policy-relevant, spatio-temporal scales. Here we provide a new proxy-based, annually resolved, spatial reconstruction of European summer (June–August) temperature fields back to 755 CE based on Bayesian hierarchical modelling (BHM), together with estimates of the European mean temperature variation since 138 BCE based on BHM and composite-plus-scaling (CPS). Our reconstructions compare well with independent instrumental and proxy-based temperature estimates, but suggest a larger amplitude in summer temperature variability than previously reported. Both the CPS and BHM reconstructions indicate that the mean 20th-century European summer temperature was not significantly different from that of some earlier centuries, including the 1st, 2nd, 8th and 10th centuries CE. The 1st century (in BHM also the 10th century) may even have been slightly warmer than the 20th century, but the difference is not statistically significant. Comparing each 50-year period with the 1951–2000 period reveals a similar pattern. Recent summers, however, have been unusually warm in the context of the last two millennia, and no 30-year period in either reconstruction exceeds the mean European summer temperature of the last three decades (1986–2015 CE). A comparison with an ensemble of climate model simulations suggests that the reconstructed European summer temperature variability over the period 850–2000 CE reflects changes in both internal variability and external forcing on multi-decadal time scales. For pan-European temperatures we find slightly better agreement between the reconstruction and the model simulations with high-end estimates for total solar irradiance. Temperature differences between the medieval period, the recent period and the Little Ice Age are larger in the reconstructions than in the simulations. This may indicate inflated variability in the reconstructions, a lack of sensitivity of the simulated European climate to changes in external forcing, and/or an underestimation of internal variability on centennial and longer time scales.
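
The abstract names two reconstruction techniques, Bayesian hierarchical modelling and composite-plus-scaling (CPS). The paper's actual implementation is not reproduced here, but the core CPS idea, compositing standardized proxies and rescaling against the instrumental target, can be sketched in a few lines; the array names and the unweighted compositing step are illustrative assumptions.

```python
# Minimal CPS sketch: standardize proxies, average them into a
# composite, then scale the composite to the instrumental target
# over a calibration window. Inputs are hypothetical.
import numpy as np

def cps_reconstruct(proxies, inst, calib):
    """proxies: (n_years, n_proxies) annually resolved proxy records
    inst:    (n_years,) instrumental summer temperatures (NaN where absent)
    calib:   boolean mask selecting the calibration years
    """
    # 1. Standardize each proxy over the calibration window.
    mu = np.nanmean(proxies[calib], axis=0)
    sd = np.nanstd(proxies[calib], axis=0)
    z = (proxies - mu) / sd

    # 2. Composite: unweighted mean across the standardized proxies.
    comp = np.nanmean(z, axis=1)

    # 3. Scale the composite to the mean and variance of the
    #    instrumental target over the calibration period.
    scale = np.nanstd(inst[calib]) / np.nanstd(comp[calib])
    return (comp - np.nanmean(comp[calib])) * scale + np.nanmean(inst[calib])
```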

Relevance:

20.00%

Abstract:

Jacques Bigart

Relevance:

20.00%

Abstract:

Analysis of recurrent events has been widely discussed in medical, health services, insurance, and engineering areas in recent years. This research proposes a nonhomogeneous Yule process with the proportional intensity assumption to model the hazard function of recurrent-events data and the associated risk factors. The method assumes that repeated events occur for each individual, with given covariates, according to a nonhomogeneous Yule process with intensity function λ_x(t) = λ_0(t) · exp(x′β). One advantage of using a nonhomogeneous Yule process for recurrent events is that it assumes the recurrence rate is proportional to the number of events that have occurred up to time t. Maximum likelihood estimation provides estimates of the model parameters, and a generalized scoring iterative procedure is applied in the numerical computation. Model comparisons between the proposed method and existing recurrent-event models are addressed by simulation. An example concerning recurrent myocardial infarction events, compared between two distinct populations (Mexican-Americans and Non-Hispanic Whites) in the Corpus Christi Heart Project, is examined.
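
Because the abstract states the intensity structure explicitly, a small simulation can show what the model implies: the more events an individual has accumulated, the higher the instantaneous rate. The baseline hazard, covariate values, and the thinning scheme below are illustrative assumptions, not the dissertation's code.

```python
# Thinning (Ogata-style) simulation of recurrent events whose
# conditional intensity is N(t) * lambda0(t) * exp(x'beta), i.e. a
# nonhomogeneous Yule process with proportional covariate effects.
import numpy as np

rng = np.random.default_rng(0)

def lambda0(t):
    return 0.5 + 0.1 * t  # assumed nondecreasing baseline intensity

def simulate(x, beta, t_max, n0=1):
    covariate_effect = np.exp(x @ beta)
    events, n, t = [], n0, 0.0
    while t < t_max:
        # lambda0 is nondecreasing here, so this bounds the
        # intensity until the next event arrives.
        lam_bar = n * lambda0(t_max) * covariate_effect
        t += rng.exponential(1.0 / lam_bar)
        if t >= t_max:
            break
        # Accept the candidate time with probability lambda(t) / lam_bar.
        if rng.random() < n * lambda0(t) * covariate_effect / lam_bar:
            events.append(t)
            n += 1  # each event raises the rate: the Yule feature
    return np.array(events)

times = simulate(x=np.array([1.0, 0.0]), beta=np.array([0.3, -0.2]), t_max=5.0)
```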

Relevance:

20.00%

Abstract:

Harry H. Johnston

Relevance:

20.00%

Abstract:

by Louis Ginzberg ; translated from the German manuscript by Henrietta Szold

Relevance:

20.00%

Abstract:

by Louis Ginzberg ; translated from the German manuscript by Henrietta Szold

Relevance:

20.00%

Abstract:

by Louis Ginzberg ; translated from the German manuscript by Paul Radin

Relevance:

20.00%

Abstract:

by Louis Ginzberg

Relevance:

20.00%

Abstract:

Emergency Departments (EDs) and Emergency Rooms (ERs) are designed to manage trauma, respond to disasters, and serve as the initial point of care for those with serious illnesses. However, because of many factors, the ED has become the doorway to the hospital and a "catch-all net" for patients, including those with non-urgent needs. This increase in ED volume has led to an increase in patient wait times. It has been well documented that there has been a constant and consistent rise in the number of patients who visit the ED (National Center for Health Statistics, 2002); that wait times for patients in the ED have increased (Pitts, Niska, Xu, & Burt, 2008); and that the cost of treatment in the ER has risen (Everett Clinic, 2008). Because the ED was designed to treat patients who need quick diagnoses and may be in potentially life-threatening circumstances, time management is critical. If a system were implemented to decrease wait times in the ED, decrease the use of ED resources, and decrease the costs borne by patients seeking care, better outcomes and greater patient satisfaction could be achieved. The goal of this research was to explore potential changes and/or alternatives to relieve the burden on the ED. To explore these options, data were collected through one-on-one interviews with seven physicians closely tied to a Level 1 ED (emergency room physicians, trauma surgeons, and primary care physicians). A qualitative analysis was performed on the interview responses. The interviews used standardized, open-ended questions probing what makes an effective ED, possible solutions for improving patient care in the ED, potential remedies for the mounting problems that plague the ED, and the feasibility of bringing primary care physicians into the ED to decrease patient wait times. From the responses, it is clear that more research is needed in this area, that several issues must be addressed, and that a variety of solutions could be implemented. The most viable option appears to be making the ED its own entity (similar to a clinic or hospital) that includes urgent care clinics as part of the system, with triage and better staffing as the most integral elements of its success.

Relevance:

20.00%

Abstract:

During the healthcare reform debate in the United States in 2009–2010, many health policy experts expressed concern that expanding coverage would increase the time patients wait to obtain care, and that such delays would in turn compromise the quality of healthcare in the United States. Using data from The Commonwealth Fund 2010 International Health Policy Survey in Eleven Countries, this study explored the relationship between wait times and quality of care, employing a wait-time scale and several quality-of-care indicators present in the dataset. Increased wait times were expected to reduce quality of care. However, this study found that wait times correlated with better health outcomes on some measures and showed no association with others. Since this is a pilot study and statistical significance was not achieved for any of the correlations, further research is needed to confirm and deepen the findings. If future studies do confirm this finding, however, an emphasis on reducing wait times at the expense of other health-system-level performance variables may be inappropriate.
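
As an illustration of the kind of screen the study describes, a wait-time scale correlated against several quality indicators with significance checked, here is a minimal sketch; the data frame and column names are hypothetical, and Spearman correlation is one reasonable choice for ordinal survey scales.

```python
# Correlate a wait-time scale against each quality-of-care indicator,
# reporting p-values so non-significant correlations (as in this
# pilot study) are flagged rather than over-interpreted.
import pandas as pd
from scipy.stats import spearmanr

def screen(df: pd.DataFrame, wait_col: str = "wait_time_scale") -> pd.DataFrame:
    rows = []
    for col in df.columns.drop(wait_col):
        rho, p = spearmanr(df[wait_col], df[col], nan_policy="omit")
        rows.append({"indicator": col, "rho": rho, "p": p,
                     "significant": p < 0.05})
    return pd.DataFrame(rows)
```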

Relevance:

20.00%

Abstract:

The purpose of this research is to develop a new statistical method to determine the minimum set of rows in an R × C contingency table of discrete data that explains the dependence of the observations. The statistical power of the method will be determined empirically by computer simulation to judge its efficiency relative to existing methods. The method will be applied to data on DNA fragment length variation at six VNTR loci in over 72 populations from five major racial groups of humans (a total sample size of over 15,000 individuals, with each sample containing at least 50 individuals). DNA fragment lengths grouped in bins will form the basis for studying inter-population DNA variation within the racial groups. Where such variation is significant, the method will provide a rigorous re-binning procedure for forensic computation of DNA profile frequencies that takes into account intra-racial DNA variation among populations.
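
The abstract does not describe the algorithm itself, so the following is only a toy illustration of the goal: find a small set of rows whose removal leaves a table consistent with independence. The greedy chi-square peeling rule is my own stand-in, not the dissertation's method.

```python
# Peel rows off an R x C table, largest chi-square contribution
# first, until the remaining rows pass an independence test; the
# peeled rows form a candidate set "explaining" the dependence.
import numpy as np
from scipy.stats import chi2_contingency

def explaining_rows(table, alpha=0.05):
    table = np.asarray(table, dtype=float)
    keep = list(range(table.shape[0]))
    removed = []
    while len(keep) > 1:
        chi2, p, dof, expected = chi2_contingency(table[keep])
        if p >= alpha:  # remaining rows look mutually independent
            break
        # Row contributing most to the chi-square statistic.
        contrib = ((table[keep] - expected) ** 2 / expected).sum(axis=1)
        worst = keep[int(np.argmax(contrib))]
        removed.append(worst)
        keep.remove(worst)
    return removed
```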