956 results for Interior point methods


Relevance: 30.00%

Publisher:

Abstract:

This paper evaluates the performance of prediction intervals generated from alternative time series models, in the context of tourism forecasting. The forecasting methods considered include the autoregressive (AR) model, the AR model using the bias-corrected bootstrap, seasonal ARIMA models, innovations state space models for exponential smoothing, and Harvey's structural time series models. We use thirteen monthly time series for the number of tourist arrivals to Hong Kong and Australia. The mean coverage rates and widths of the alternative prediction intervals are evaluated in an empirical setting. It is found that all models produce satisfactory prediction intervals, except for the autoregressive model. In particular, the intervals based on the bias-corrected bootstrap perform best in general, providing tight intervals with accurate coverage rates, especially when the forecast horizon is long.
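
As an illustration of the interval evaluation described above, the following is a minimal sketch (not the paper's exact procedure, and without the bias correction) of a residual-bootstrap prediction interval for an AR(1) forecast, together with the empirical coverage rate and mean interval width used as evaluation criteria; the series and all parameter values are simulated for illustration only.

```python
# A minimal sketch (assumed, not the paper's procedure) of a residual-bootstrap
# prediction interval for an AR(1) forecast, evaluated by empirical coverage
# and mean width. The data are simulated.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
phi, n, H, B = 0.7, 200, 12, 1000       # AR coefficient, sample size, horizon, bootstrap reps

# Simulate an AR(1) series plus an out-of-sample continuation of length H.
y = np.zeros(n + H)
for t in range(1, n + H):
    y[t] = phi * y[t - 1] + rng.standard_normal()
train, future = y[:n], y[n:]

fit = AutoReg(train, lags=1).fit()
c, a = fit.params                       # intercept and AR(1) coefficient
resid = np.asarray(fit.resid)

# Bootstrap forecast paths: iterate the fitted recursion with resampled residuals.
paths = np.empty((B, H))
for b in range(B):
    prev = train[-1]
    for h in range(H):
        prev = c + a * prev + rng.choice(resid)
        paths[b, h] = prev

lower, upper = np.percentile(paths, [2.5, 97.5], axis=0)
coverage = np.mean((future >= lower) & (future <= upper))   # empirical coverage rate
width = np.mean(upper - lower)                              # mean interval width
print(f"coverage {coverage:.2f}, mean width {width:.2f}")
```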

Relevance: 30.00%

Publisher:

Abstract:

In January 2011 a swollen Brisbane River broke its banks, flooding riverside houses and buildings. The river's water spread and rose up through stormwater drains, inundating some 20,000 houses in low-lying land. As the water receded, those residents affected by the floods returned to their homes to assess the damage. While some people breathed a sigh of relief, others were devastated by the overwhelming damage to their homes and personal belongings. Over the next few weeks the landscape of Brisbane was altered not merely by the mud and debris left by the torrent of water, but by the piles of domestic contents occupying Brisbane streets. Beds, toys, cabinets, plasterboard, tiles and household furniture lined curbsides waiting for collection. Later they would accumulate in public parks and sports centres to await disposal, momentarily creating an unsettling landscape of discarded domestic interiors. While most houses remained standing, the heartbreaking repercussions were evident in their interiority. Thousands of volunteers flocked to help those affected by the floods to purge the damage left by the water, removing wall and floor linings and discarding furniture and spoilt belongings. In her paper on Hurricane Katrina, Julieanna Preston wrote, 'What anthropological evidence would we find as we followed their migration – heaps left by the side of the road, the physical weight overcoming the personal value…' For many of the post-flood restored homes and buildings, entire interiors have been replaced, eradicating any trace of the significant event that disturbed them only months earlier. There were artefacts that would have survived the floods, such as furniture of solid timber, yet these were discarded, and with them the patina that marked an important event in history. The patina is beyond technological reproducibility, which, as Walter Benjamin writes, is the whole premise of genuineness. It is the role of the French Polisher to maintain the true wear of the artefact, for it is the patina that is most valuable in its ability to narrate the history of a piece. In 2012 two separate exhibitions in Brisbane will take place to display a selected collection of flood-damaged artefacts. This orchestrated way of commemorating the damage left by the floods may be a method of compensating for the haste with which the damage was purged from the city. This need for exhibiting damaged artefacts illustrates Andreas Huyssen's point that "…today memory is understood as a mode of re-presentation and as belonging to the present." This research looks at the dying trade of the French Polisher through conversations and a visual study of flood-damaged furniture. The research also investigates the personal loss of artefacts through intimate stories shared by flood victims. This paper seeks to understand why so much was discarded and to celebrate what remains.

Relevance: 30.00%

Publisher:

Abstract:

Thin plate spline finite element methods are used to fit a surface to an irregularly scattered dataset [S. Roberts, M. Hegland, and I. Altas. Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions. SIAM, 1:208--234, 2003]. The computational bottleneck for this algorithm is the solution of large, ill-conditioned systems of linear equations at each step of a generalised cross validation algorithm. Preconditioning techniques are investigated to accelerate the convergence of the solution of these systems using Krylov subspace methods. The preconditioners under consideration are block diagonal, block triangular and constraint preconditioners [M. Benzi, G. H. Golub, and J. Liesen. Numerical solution of saddle point problems. Acta Numer., 14:1--137, 2005]. The effectiveness of each of these preconditioners is examined on a sample dataset taken from a known surface. From our numerical investigation, constraint preconditioners appear to provide improved convergence for this surface fitting problem compared to block preconditioners.
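
To make the preconditioning setup concrete, here is a minimal sketch, assuming a generic sparse saddle-point system of the form [[A, B^T], [B, 0]] rather than the actual thin plate spline finite element matrices, of a block-diagonal preconditioner applied within a Krylov (MINRES) solve using SciPy; the blocks and the diagonal Schur-complement approximation are stand-ins, not the authors' construction.

```python
# A minimal sketch (assumed, not the authors' code) of a block-diagonal
# preconditioner inside a MINRES solve of a saddle-point system
# [[A, B^T], [B, 0]]. The sparse blocks are random stand-ins, and the (2,2)
# preconditioner block uses a cheap diagonal approximation of the Schur complement.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n, m = 200, 50
A = sp.random(n, n, density=0.05, random_state=0)
A = (A @ A.T + sp.identity(n)).tocsc()                       # SPD (1,1) block
B = sp.random(m, n, density=0.10, random_state=1).tocsr()    # constraint block

K = sp.bmat([[A, B.T], [B, None]], format="csc")             # symmetric saddle-point matrix
rhs = np.ones(n + m)

Ainv = spla.splu(A)                                          # factor the (1,1) block
S = (B @ sp.diags(1.0 / A.diagonal()) @ B.T).tocsc()         # approximate Schur complement
Sinv = spla.splu(S)

def apply_prec(v):
    # Apply diag(A, S)^-1 to a vector split into primal and multiplier parts.
    return np.concatenate([Ainv.solve(v[:n]), Sinv.solve(v[n:])])

M = spla.LinearOperator((n + m, n + m), matvec=apply_prec)
x, info = spla.minres(K, rhs, M=M)
print("converged" if info == 0 else f"minres info = {info}")
```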

Relevance: 30.00%

Publisher:

Abstract:

PURPOSE Every health care sector, including hospice/palliative care, needs to systematically improve services using patient-defined outcomes. Data from the national Australian Palliative Care Outcomes Collaboration are used to determine whether hospice/palliative care patients' outcomes, and the consistency of those outcomes, have improved over the last 3 years. METHODS Data were analysed by clinical phase (stable, unstable, deteriorating, terminal). Patient-level data included the Symptom Assessment Scale and the Palliative Care Problem Severity Score. Nationally collected point-of-care data were anchored to the period July-December 2008 and subsequently compared to this baseline over six 6-month reporting cycles for all services that submitted data in every time period (n = 30), using individual longitudinal multi-level random coefficient models. RESULTS Data were analysed for 19,747 patients (46% female; 85% cancer; 27,928 episodes of care; 65,463 phases). There were significant improvements across all domains (symptom control, family care, psychological and spiritual care) except pain. Simultaneously, the interquartile ranges decreased, jointly indicating that better and more consistent patient outcomes were being achieved. CONCLUSION These are the first national hospice/palliative care symptom control performance data to demonstrate improvements in clinical outcomes at a service level as a result of routine data collection and systematic feedback.
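
A minimal sketch of the kind of longitudinal multi-level random coefficient model described above, using statsmodels; the file name and column names (service, period, sas_pain) are hypothetical, not the collaboration's actual dataset or analysis code.

```python
# A minimal sketch (assumed, not the collaboration's code) of a longitudinal
# multi-level random coefficient model: a symptom score regressed on reporting
# period, with a random intercept and random slope per service.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("outcomes.csv")   # hypothetical file: one row per phase
# expected columns: service, period (0 = Jul-Dec 2008 baseline, 1..5), sas_pain

model = smf.mixedlm("sas_pain ~ period", data=df,
                    groups=df["service"], re_formula="~period")
result = model.fit(reml=True)
print(result.summary())            # the 'period' fixed effect is the per-cycle trend
```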

Relevance: 30.00%

Publisher:

Abstract:

Background: It is important for nutrition intervention in malnourished patients to be guided by accurate evaluation and detection of small changes in the patient's nutrition status over time. However, the current Subjective Global Assessment (SGA) is not able to detect changes over a short period of time. The aim of the study was to determine whether the 7-point SGA is more time-sensitive to nutrition changes than the conventional SGA. Methods: In this prospective study, 67 adult inpatients assessed as malnourished using both the 7-point SGA and the conventional SGA were recruited. Each patient received nutrition intervention and was followed up post-discharge. Patients were reassessed using both tools at 1, 3 and 5 months from the baseline assessment. Results: It took a significantly shorter time to see a one-point change using the 7-point SGA compared with the conventional SGA (median: 1 month vs. 3 months, p = 0.002). The likelihood of at least a one-point change was 6.74 times greater with the 7-point SGA than with the conventional SGA after controlling for age, gender and medical specialty (odds ratio = 6.74, 95% CI 2.88-15.80, p < 0.001). Fifty-six percent of patients who had no change in conventional SGA score had changes detected using the 7-point SGA. The level of agreement was 100% (kappa = 1, p < 0.001) between the 7-point SGA and the 3-point SGA, and 83% (kappa = 0.726, p < 0.001) between two blinded assessors for the 7-point SGA. Conclusion: The 7-point SGA is more time-sensitive in its response to nutrition changes than the conventional SGA. It can be used to guide nutrition intervention for patients.
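
For the inter-rater agreement figures quoted above, a minimal sketch of computing percentage agreement and Cohen's kappa between two assessors' 7-point SGA ratings; the ratings below are made-up illustrative data, not the study's.

```python
# A minimal sketch (assumed, not the study's code) of percentage agreement and
# Cohen's kappa between two blinded assessors' 7-point SGA ratings.
import numpy as np
from sklearn.metrics import cohen_kappa_score

assessor_a = np.array([6, 5, 6, 4, 5, 6, 3, 5, 6, 5, 4, 6])   # invented ratings
assessor_b = np.array([6, 5, 6, 4, 5, 5, 3, 5, 6, 5, 4, 5])

agreement = np.mean(assessor_a == assessor_b) * 100
kappa = cohen_kappa_score(assessor_a, assessor_b)
print(f"agreement {agreement:.0f}%, kappa {kappa:.2f}")
```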

Relevance: 30.00%

Publisher:

Abstract:

This paper examines the feasibility of using vertical light pipes to naturally illuminate the central core of a multilevel building not reached by window light. The challenges addressed were finding a method to extract and distribute equal amounts of light at each level and designing collectors to improve the effectiveness of vertical light pipes in delivering low-elevation sunlight to the interior. Extraction was achieved by inserting partially reflecting cones within transparent sections of the pipes at each floor level. Theory was formulated to estimate the partial reflectance necessary to provide equal light extraction at each level. Designs for daylight collectors formed from laser cut panels tilted above the light pipe were developed, and the benefits and limitations of static collectors as opposed to collectors that follow the sun's azimuth were investigated. Performance was assessed with both basic and detailed mathematical simulation and by observations made with a five-level model building under clear sky conditions.
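
The equal-extraction requirement can be illustrated with a simple calculation. Assuming an idealised pipe with no transmission losses (a simplification, not the paper's full theory), an extractor at floor k along the direction of travel must reflect a fraction 1/(N - k + 1) of the flux reaching it for all N floors to receive equal shares:

```python
# A minimal sketch (assumed, losses ignored) of the equal-extraction idea for a
# light pipe serving N floors: each extractor reflects a fraction r_k of the
# flux reaching it, chosen so every floor receives the same share 1/N.
N = 5
flux = 1.0
for k in range(1, N + 1):
    r_k = 1.0 / (N - k + 1)        # required partial reflectance at floor k
    extracted = r_k * flux
    flux -= extracted
    print(f"floor {k}: reflectance {r_k:.3f}, extracted fraction {extracted:.3f}")
# Every floor receives 1/N = 0.200 of the input flux, and nothing is left over.
```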

Relevance: 30.00%

Publisher:

Abstract:

Background In the emergency department, portable point-of-care testing (POCT) coagulation devices may facilitate stroke patient care by providing rapid International Normalized Ratio (INR) measurement. The objective of this study was to evaluate the reliability, validity, and impact on clinical decision-making of a POCT device for INR testing in the setting of acute ischemic stroke (AIS). Methods A total of 150 patients (50 healthy volunteers, 51 anticoagulated patients, 49 AIS patients) were assessed in a tertiary care facility. INRs were measured using the Roche CoaguChek S and the standard laboratory technique. Results The intraclass correlation coefficient between overall POCT device and standard laboratory INR values was high, at 0.932 (95% CI 0.69-0.78). In the AIS group alone, the correlation coefficient was also high, at 0.937 (95% CI 0.59-0.74), and the diagnostic accuracy of the POCT device was 94%. Conclusions When used by a trained health professional in the emergency department to assess INR in acute ischemic stroke patients, the CoaguChek S is reliable and provides rapid results. However, as concordance with laboratory INR values decreases with higher INR values, it is recommended that for CoaguChek S INRs in the > 1.5 range, a standard laboratory measurement be used to confirm the results.
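
As an illustration of the agreement statistic reported above, here is a minimal sketch of a two-way random-effects intraclass correlation, ICC(2,1), following the Shrout and Fleiss formulation; the paired INR values are invented for illustration and are not the study's data.

```python
# A minimal sketch (assumed, not the study's code) of ICC(2,1) for agreement
# between POCT and laboratory INR values, using the Shrout & Fleiss (1979) formula.
import numpy as np

# rows = patients, columns = methods (POCT, laboratory); invented values
Y = np.array([[1.1, 1.0], [2.3, 2.4], [3.1, 3.4], [1.8, 1.7],
              [2.9, 3.0], [1.2, 1.2], [4.0, 4.5], [2.0, 2.1]])
n, k = Y.shape
grand = Y.mean()
row_means = Y.mean(axis=1)
col_means = Y.mean(axis=0)

msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between-subjects mean square
msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between-methods mean square
sse = np.sum((Y - row_means[:, None] - col_means[None, :] + grand) ** 2)
mse = sse / ((n - 1) * (k - 1))                         # residual mean square

icc_2_1 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
print(f"ICC(2,1) = {icc_2_1:.3f}")
```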

Relevance: 30.00%

Publisher:

Abstract:

Whole genome sequences are generally accepted as excellent tools for studying evolutionary relationships. Because of the problems caused by uncertainty in alignment, existing tools for phylogenetic analysis based on multiple alignments cannot be directly applied to whole-genome comparison and phylogenomic studies. There has been a growing interest in alignment-free methods for phylogenetic analysis using complete genome data. The "distances" used in these alignment-free methods are not proper distance metrics in the strict mathematical sense. In this study, we first review them within a more general framework, that of dissimilarity. Then we propose some new dissimilarities for phylogenetic analysis. Lastly, three genome datasets are employed to evaluate these dissimilarities from a biological point of view.
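
As one concrete example of the kind of alignment-free dissimilarity reviewed here (not necessarily one of the new measures proposed), the sketch below computes a cosine-type dissimilarity between k-mer frequency vectors of two sequences; like many such measures, it is not a metric in the strict sense.

```python
# A minimal sketch (assumed, not the authors' measures) of a common
# alignment-free dissimilarity: 1 - cosine similarity between the k-mer
# frequency vectors of two sequences.
from collections import Counter
from math import sqrt

def kmer_freqs(seq, k=4):
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def cosine_dissimilarity(seq1, seq2, k=4):
    f1, f2 = kmer_freqs(seq1, k), kmer_freqs(seq2, k)
    dot = sum(f1[m] * f2[m] for m in set(f1) & set(f2))
    norm1 = sqrt(sum(v * v for v in f1.values()))
    norm2 = sqrt(sum(v * v for v in f2.values()))
    return 1.0 - dot / (norm1 * norm2)

print(cosine_dissimilarity("ACGTACGTGGCA", "ACGTTGCAGGCA", k=3))
```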

Relevance: 30.00%

Publisher:

Abstract:

Reconstructing 3D motion data is highly under-constrained due to several common sources of data loss during measurement, such as projection, occlusion, or miscorrespondence. We present a statistical model of 3D motion data, based on the Kronecker structure of the spatiotemporal covariance of natural motion, as a prior on 3D motion. This prior is expressed as a matrix normal distribution, composed of separable and compact row and column covariances. We relate the marginals of the distribution to the shape, trajectory, and shape-trajectory models of prior art. When the marginal shape distribution is not available from training data, we show how placing a hierarchical prior over shapes results in a convex MAP solution in terms of the trace-norm. The matrix normal distribution, fit to a single sequence, outperforms state-of-the-art methods at reconstructing 3D motion data in the presence of significant data loss, while providing covariance estimates of the imputed points.
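
A minimal sketch, assuming illustrative covariances rather than ones fit to motion capture data, of the separable (Kronecker) structure behind the matrix normal prior: a draw X ~ MN(M, U, V) and a check that its log density can be evaluated either through the full Kronecker covariance or in the compact matrix-normal form that never builds it.

```python
# A minimal sketch (assumed, not the authors' code) of a matrix normal
# distribution with separable row/column covariances: X ~ MN(M, U, V) is
# equivalent to vec(X) ~ N(vec(M), kron(V, U)) with column-stacking vec.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
F, P = 30, 15                                   # frames (rows), points (columns)
M = np.zeros((F, P))                            # mean motion

# Smooth temporal (row) covariance and a generic SPD spatial (column) covariance.
t = np.arange(F)
U = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 25.0) + 1e-6 * np.eye(F)
G = rng.standard_normal((P, P))
V = G @ G.T / P + 1e-6 * np.eye(P)

# Draw X = M + A Z B^T with U = A A^T and V = B B^T.
A, B = np.linalg.cholesky(U), np.linalg.cholesky(V)
X = M + A @ rng.standard_normal((F, P)) @ B.T

# Density via the explicit Kronecker covariance (small enough here to build).
lp_kron = multivariate_normal(M.ravel(order="F"),
                              np.kron(V, U)).logpdf(X.ravel(order="F"))

# Same density in matrix-normal form, which avoids the Kronecker matrix entirely.
R = X - M
quad = np.trace(np.linalg.solve(V, R.T) @ np.linalg.solve(U, R))
lp_mn = -0.5 * (quad + F * P * np.log(2 * np.pi)
                + P * np.linalg.slogdet(U)[1] + F * np.linalg.slogdet(V)[1])
print(lp_kron, lp_mn)   # should agree to numerical precision
```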

Relevance: 30.00%

Publisher:

Abstract:

The widespread and increasing resistance of internal parasites to anthelmintic control is a serious problem for the Australian sheep and wool industry. As part of control programmes, laboratories use the Faecal Egg Count Reduction Test (FECRT) to determine resistance to anthelmintics. It is important to have confidence in the measure of resistance, not only for the producer planning a drenching programme but also for companies investigating the efficacy of their products. The determination of resistance and corresponding confidence limits as given in the anthelmintic efficacy guidelines of the Standing Committee on Agriculture (SCA) is based on a number of assumptions. This study evaluated the appropriateness of these assumptions for typical data and compared the effectiveness of the standard FECRT procedure with that of alternative procedures. Several sets of historical experimental data from sheep and goats were analysed, showing that a negative binomial distribution described pre-treatment helminth egg counts in faeces more appropriately than a normal distribution. Simulated egg counts for control animals were generated stochastically from negative binomial distributions, and those for treated animals from negative binomial and binomial distributions. Three methods for determining resistance when percent reduction is based on arithmetic means were applied. The first was that advocated in the SCA guidelines; the second was similar to the first but based its variance estimates on negative binomial distributions; and the third used Wadley's method, with the distribution of the response variate assumed negative binomial and a logit link transformation. These were also compared with a fourth method, recommended by the International Co-operation on Harmonisation of Technical Requirements for Registration of Veterinary Medicinal Products (VICH) programme, in which percent reduction is based on geometric means. A wide selection of parameters was investigated, and for each set 1000 simulations were run. Percent reduction and confidence limits were then calculated for each method, together with the number of times in each set of 1000 simulations that the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been said to occur. These simulations provide the basis for setting conditions under which the methods could be recommended. The authors show that, given the distribution of helminth egg counts found in Queensland flocks, the method based on arithmetic rather than geometric means should be used, and suggest that resistance be redefined as occurring when the upper level of percent reduction is less than 95%. At least ten animals per group are required in most circumstances, though even 20 may be insufficient where the effectiveness of the product is close to the cut-off point for defining resistance.
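
A minimal sketch, in the spirit of the simulations described (not the authors' code or the exact SCA guideline implementation), of drawing egg counts from negative binomial distributions and computing percent reduction from arithmetic means with an approximate 95% confidence interval on the log of the mean ratio:

```python
# A minimal sketch (assumed) of a stochastic FECRT simulation: egg counts drawn
# from a negative binomial for control and treated groups, percent reduction
# computed from arithmetic means with an approximate 95% CI on the log mean ratio.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, mean_c, k = 10, 500.0, 0.7        # animals per group, control mean, aggregation parameter
true_efficacy = 0.90                 # treatment removes 90% of eggs

def nb_sample(mean, k, size):
    # scipy's nbinom uses (n, p); with n = k and p = k/(k + mean) the mean is 'mean'.
    p = k / (k + mean)
    return stats.nbinom.rvs(k, p, size=size, random_state=rng)

control = nb_sample(mean_c, k, n)
treated = nb_sample(mean_c * (1 - true_efficacy), k, n)

xc, xt = control.mean(), treated.mean()
reduction = 100 * (1 - xt / xc)

# Delta-method variance of log(xt/xc); t critical value with n_c + n_t - 2 df.
var_log = treated.var(ddof=1) / (n * xt**2) + control.var(ddof=1) / (n * xc**2)
tcrit = stats.t.ppf(0.975, 2 * n - 2)
ci_lo = 100 * (1 - np.exp(np.log(xt / xc) + tcrit * np.sqrt(var_log)))
ci_hi = 100 * (1 - np.exp(np.log(xt / xc) - tcrit * np.sqrt(var_log)))
print(f"reduction {reduction:.1f}% (95% CI {ci_lo:.1f}-{ci_hi:.1f})")
```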

Relevance: 30.00%

Publisher:

Abstract:

Aerial surveys of kangaroos (Macropus spp.) in Queensland are used to make economically important judgements on the levels of viable commercial harvest. Previous analysis methods for aerial kangaroo surveys have used both mark-recapture methodologies and conventional distance-sampling analyses. Conventional distance sampling has the disadvantage that detection is assumed to be perfect on the transect line, while mark-recapture methods are notoriously sensitive to problems with unmodelled heterogeneity in capture probabilities. We introduce three methodologies for combining mark-recapture and distance-sampling data, aimed at exploiting the strengths of both methodologies and overcoming their weaknesses. Of these methods, two are based on the assumption of full independence between observers in the mark-recapture component, and this appears to introduce more bias in density estimation than it resolves through allowing uncertain trackline detection. Both of these methods give lower density estimates than conventional distance sampling, indicating a clear failure of the independence assumption. The third method, termed point independence, appears to perform very well, giving credible density estimates and good properties in terms of goodness-of-fit and percentage coefficient of variation. Estimated densities of eastern grey kangaroos range from 21 to 36 individuals km⁻², with estimated coefficients of variation between 11% and 14% and estimated trackline detection probabilities primarily between 0.7 and 0.9.
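
For comparison with the combined methods, here is a minimal sketch of the conventional distance-sampling baseline, assuming an untruncated half-normal detection function; the transect length, detection scale and simulated distances are illustrative, not survey data.

```python
# A minimal sketch (assumed, not the survey's analysis code) of conventional
# distance sampling with an untruncated half-normal detection function: sigma
# is the maximum-likelihood estimate from perpendicular distances, and density
# follows from the effective strip half-width.
import numpy as np

rng = np.random.default_rng(2)
L_km = 500.0                                     # total transect length
sigma_true = 0.08                                # detection scale in km
x = np.abs(rng.normal(0, sigma_true, size=400))  # observed perpendicular distances

sigma_hat = np.sqrt(np.mean(x ** 2))             # half-normal MLE of sigma
esw = sigma_hat * np.sqrt(np.pi / 2)             # effective strip half-width (km)
density = len(x) / (2 * L_km * esw)              # individuals per km^2
print(f"sigma {sigma_hat:.3f} km, ESW {esw:.3f} km, density {density:.1f} km^-2")
```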

Relevance: 30.00%

Publisher:

Abstract:

Short-time analytical solutions of temperature and the moving boundary in two-dimensional two-phase freezing due to a cold spot are presented in this paper. The melt occupies a semi-infinite region. Although the method of solution is valid for various other types of boundary conditions, the results in this paper are given only for prescribed flux boundary conditions, which may be space- and time-dependent. The freezing front propagation along the interior of the melt region exhibits well-known behaviour, but the propagation along the surface is of a new type. The freezing front always depends on material parameters. Several interesting results can be obtained as particular cases of the general results.
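
For orientation, the interface condition in a two-phase freezing (Stefan) problem of this kind generally takes the classical form below; this is the standard textbook statement, given as background rather than the paper's specific two-dimensional formulation.

```latex
% Classical two-phase Stefan condition (energy balance at the freezing front),
% written here for a one-dimensional front x = s(t); the paper's 2D problem
% couples an analogous balance to prescribed-flux boundary conditions.
\[
  \rho L \,\frac{ds}{dt}
  = k_s \left.\frac{\partial T_s}{\partial x}\right|_{x=s(t)}
  - k_l \left.\frac{\partial T_l}{\partial x}\right|_{x=s(t)},
\]
% rho: density, L: latent heat of fusion, k_s, k_l: solid and liquid thermal
% conductivities, T_s, T_l: temperature fields in the solid and the melt.
```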

Relevance: 30.00%

Publisher:

Abstract:

In this thesis, two separate single nucleotide polymorphism (SNP) genotyping techniques were set up at the Finnish Genome Center, pooled genotyping was evaluated as a screening method for large-scale association studies, and finally, the former approaches were used to identify genetic factors predisposing to two distinct complex diseases by utilizing large epidemiological cohorts and also taking environmental factors into account. The first genotyping platform was based on traditional but improved restriction fragment length polymorphism (RFLP) analysis utilizing 384-well microtiter plates, multiplexing, small reaction volumes (5 µl), and automated genotype calling. We participated in the development of the second genotyping method, based on single nucleotide primer extension (SNuPeTM by Amersham Biosciences), by carrying out the alpha and beta tests for the chemistry and the allele-calling software. Both techniques proved to be accurate, reliable, and suitable for projects with thousands of samples and tens of markers. Pooled genotyping (genotyping of pooled instead of individual DNA samples) was evaluated with Sequenom's MassArray MALDI-TOF, in addition to the SNuPeTM and PCR-RFLP techniques. We used MassArray mainly as a point of comparison, because it is known to be well suited for pooled genotyping. All three methods were shown to be accurate, the standard deviations between measurements being 0.017 for the MassArray, 0.022 for the PCR-RFLP, and 0.026 for the SNuPeTM. The largest source of error in the process of pooled genotyping was shown to be the volumetric error, i.e., the preparation of pools. We also demonstrated that it would have been possible to narrow down the genetic locus underlying congenital chloride diarrhea (CLD), an autosomal recessive disorder, by using the pooling technique instead of genotyping individual samples. Although the approach seems to be well suited for traditional case-control studies, it is difficult to apply if any kind of stratification based on environmental factors is needed. Therefore we chose to continue with individual genotyping in the following association studies. Samples in the two separate large epidemiological cohorts were genotyped with the PCR-RFLP and SNuPeTM techniques. The first of these association studies concerned various pregnancy complications among 100,000 consecutive pregnancies in Finland, of which we genotyped 2292 patients and controls, in addition to a population sample of 644 blood donors, for 7 polymorphisms in potentially thrombotic genes. In this thesis, the analysis of a sub-study of pregnancy-related venous thromboses is included. We showed that the impact of the factor V Leiden polymorphism, but not of the other tested polymorphisms, on pregnancy-related venous thrombosis was fairly large (odds ratio 11.6; 95% CI 3.6-33.6), and that it increased multiplicatively when combined with other risk factors such as obesity or advanced age. Owing to our study design, we were also able to estimate the risks at the population level. The second epidemiological cohort was the Helsinki Birth Cohort of men and women who were born during 1924-1933 in Helsinki. The aim was to identify genetic factors that might modify the well-known link between small birth size and adult metabolic diseases, such as type 2 diabetes and impaired glucose tolerance.
Among ~500 individuals with detailed birth measurements and a current metabolic profile, we found that an insertion/deletion polymorphism of the angiotensin converting enzyme (ACE) gene was associated with the duration of gestation and with weight and length at birth. Interestingly, the ACE insertion allele was also associated with higher indices of insulin secretion (p = 0.0004) in adult life, but only among individuals who were born small (those in the lowest third of birth weight). Likewise, low birth weight was associated with higher indices of insulin secretion (p = 0.003), but only among carriers of the ACE insertion allele. An association with birth measurements was also found for a common haplotype of the glucocorticoid receptor (GR) gene. Furthermore, the association between short length at birth and adult impaired glucose tolerance was confined to carriers of this haplotype (p = 0.007). These associations exemplify the interaction between environmental factors and genotype, which, possibly through altered gene expression, predisposes to complex metabolic diseases. Indeed, we showed that the common GR gene haplotype was associated with reduced mRNA expression in the thymus of three individuals (p = 0.0002).

Relevance: 30.00%

Publisher:

Abstract:

Printing papers have been the main product of the Finnish paper industry. To improve the properties and economy of printing papers, the controlling of tracheid cross-sectional dimensions and of wood viscoelasticity is examined in this study. Controlling is understood as any procedure which yields raw material classes with distinct properties and small internal variation. Tracheid cross-sectional dimensions, i.e., cell wall thickness and radial and tangential diameters, can be controlled with methods such as sorting wood into pulpwood and sawmill chips, sorting of logs according to tree social status, and fractionation of fibres. These control methods were analysed in this study with simulations, which were based on measured tracheid cross-sectional dimensions. A SilviScan device was used to measure the data set from five Norway spruce (Picea abies) and five Scots pine (Pinus sylvestris) trunks. The simulation results indicate that the sawmill chips and top pulpwood assortments have quite similar cross-sectional dimensions. Norway spruce and Scots pine are, on average, also relatively similar in their cross-sectional dimensions. The distributions of these species are somewhat different, but from a practical point of view, the differences are probably of minor importance. The controlling of tracheid cross-sectional dimensions can be done most efficiently with methods that can separate fibres into earlywood and latewood. Sorting of logs or partitioning of logs into juvenile and mature wood were markedly less efficient control methods than fractionation of fibres. Wood viscoelasticity affects energy consumption in mechanical pulping, and is thus an interesting control target when improving the energy efficiency of the process. A literature study was made to evaluate the possibility of using viscoelasticity in controlling. The study indicates that there is considerable variation in viscoelastic properties within tree species, but unfortunately, the viscoelastic properties of important raw material lots such as top pulpwood or sawmill chips are not known. Viscoelastic properties of wood depend mainly on lignin, but also on the microfibrillar angle, the width of cellulose crystals and tracheid cross-sectional dimensions.

Relevance: 30.00%

Publisher:

Abstract:

Modifications of surface materials and their effects on cleanability have important impacts in many fields of activity. In this study the primary aim was to develop radiochemical methods suitable for evaluating cleanability in material research for different environments. Another aim was to investigate the effects of surface modifications on the cleanability and surface properties of plastics, ceramics, concrete materials and their coatings in conditions simulating their typical environments. Several new 51Cr- and 14C-labelled soils were developed for the testing situations. The new radiochemical methods developed were suitable for examining different surface materials and different soil types, providing quantitative information about the amount of soil on surfaces. They also take into account soil soaked into surfaces. The supporting methods, colorimetric determination and ATP bioluminescence, provided semi-quantitative results. The results from the radiochemical and supporting methods partly correlated with each other. From a material research point of view, numerous new materials were evaluated. These included both laboratory-made model materials and commercial products. Increasing the amount of plasticizer decreased the cleanability of poly(vinyl chloride) (PVC) materials. Microstructured surfaces of plastics improved the cleanability of PVC from particle soils, whereas for oil soil microstructuring reduced the cleanability. In the case of glazed ceramic materials, coatings affected the cleanability. The roughness of surfaces correlated with cleanability from particle soils, and the cleanability from oil soil correlated with the contact angles. Organic particle soil was removed more efficiently from TiO2-coated ceramic surfaces after UV irradiation than without UV treatment, whereas no effect was observed on the cleanability of oil soil. Coatings improved the cleanability of concrete flooring materials intended for use in animal houses.