44 results for Recursive Partitioning and Regression Trees (RPART)


Relevance:

100.00%

Publisher:

Abstract:

Background & aims: Little is known about energy requirements in patients with traumatic brain injury (TBI), despite evidence suggesting that adequate nutritional support can improve clinical outcomes. The aim of this study was to compare predicted energy requirements with measured resting energy expenditure (REE) values in patients recovering from TBI.

Methods: Indirect calorimetry (IC) was used to measure REE in 45 patients with TBI. Predicted energy requirements were determined using the FAO/WHO/UNU and Harris–Benedict (HB) equations. Bland–Altman and regression analyses were used.

Results: One hundred and sixty-seven successful measurements were recorded in patients with TBI. At an individual level, both equations predicted REE poorly. The mean of the differences between measured REE and FAO/WHO/UNU predictions was near zero (9 kcal), but the variation in both directions was substantial (range −591 to +573 kcal). Similarly, the differences between measured REE and HB predictions had a mean of 1.9 kcal and a range of −568 to +571 kcal. Glasgow Coma Score, patient status, weight and body temperature were significant predictors of measured REE (p < 0.001; R² = 0.47).

Conclusions: Clinical equations are poor predictors of measured REE in patients with TBI. The variability in REE is substantial. Clinicians should be aware of the limitations of prediction equations when estimating energy requirements in TBI patients.
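
The Harris–Benedict prediction and the Bland–Altman comparison used in the methods above can be illustrated with a minimal sketch. This is not the study's code: the patient values below are invented, and only the bias and 95% limits of agreement are computed.

```python
# Illustrative sketch (not the study's code): comparing Harris-Benedict predicted
# resting energy expenditure (REE) against measured values with a Bland-Altman summary.
# The coefficients are the published 1919 Harris-Benedict equations; the patient
# data are invented for demonstration.
import numpy as np

def harris_benedict(weight_kg, height_cm, age_yr, sex):
    """Predicted REE (kcal/day) from the original Harris-Benedict equations."""
    if sex == "male":
        return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

# Hypothetical measured REE (indirect calorimetry) and patient characteristics.
measured = np.array([1850.0, 2100.0, 1620.0, 2400.0])
predicted = np.array([
    harris_benedict(80, 178, 35, "male"),
    harris_benedict(95, 182, 42, "male"),
    harris_benedict(62, 165, 55, "female"),
    harris_benedict(102, 185, 28, "male"),
])

# Bland-Altman statistics: mean difference (bias) and 95% limits of agreement.
diff = measured - predicted
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.0f} kcal, limits of agreement = {bias - loa:.0f} to {bias + loa:.0f} kcal")
```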

Relevance:

100.00%

Publisher:

Abstract:

The project comprises the re-ordering and extension of a 19th-century country house in the extreme south-west of Ireland. The original house is what can be termed an Irish house of the middle size. A common typology in 19th-century Ireland, the classical house of the middle size is characterised by a highly ordered plan containing a variety of rooms within a square or rectangular form. A strategy of elaborating the threshold between the reception rooms of the house and the garden was adopted by wrapping the house in a notional forest of columns, creating deep verandas to the south and west of the main living spaces. The grid of structural columns derived its proportions directly from the house. The columns became analogous with the mature oak and pine trees in the garden beyond, while the floor and ceiling were considered as landscapes in their own right: the black floor forms hearthstone, kitchen island and basement cellar, and the concrete roof is inflected to hold rooflights, a chimney and a landscape of pleasure above.

Aims / Objectives / Questions
1. To restore and extend a “house of the middle size”, a historic Irish typology, in a sympathetic manner.
2. To address the new-build accommodation in a sustainable manner through strategies associated with orientation, microclimates, materiality, and mechanical and structural engineering.
3. To explore and develop an understanding of two spatial orders: the enfilade room and the non-directional space of the grid.
4. The creation of deep threshold space.
5. Marbling as a finish in fair-faced concrete.
6. Concrete as a sustainable building material.

Relevance:

100.00%

Publisher:

Abstract:

Tuberculosis (TB) caused by Mycobacterium bovis is a re-emerging disease of livestock that is of major economic importance worldwide, as well as being a zoonotic risk. There is significant heritability for host resistance to bovine TB (bTB) in dairy cattle. To identify resistance loci for bTB, we undertook a genome-wide association study in female Holstein-Friesian cattle with 592 cases and 559 age-matched controls from case herds. Cases and controls were categorised into distinct phenotypes: skin test and lesion positive versus skin test negative on multiple occasions, respectively. These animals were genotyped with the Illumina BovineHD 700K BeadChip. Genome-wide rapid association using linear and logistic mixed models and regression (GRAMMAR), regional heritability mapping (RHM) and haplotype-sharing analysis identified two novel resistance loci that attained chromosome-wise significance, protein tyrosine phosphatase receptor T (PTPRT; P = 4.8 × 10⁻⁷) and myosin IIIB (MYO3B; P = 5.4 × 10⁻⁶). We estimated that 21% of the phenotypic variance in TB resistance could be explained by all of the informative single-nucleotide polymorphisms, of which the region encompassing the PTPRT gene accounted for 6.2% of the variance and a further 3.6% was associated with a putative copy number variant in MYO3B. The results from this study add to our understanding of variation in host control of infection and suggest that genetic marker-based selection for resistance to bTB has the potential to make a significant contribution to bTB control.
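
As a rough illustration of the case-control association testing described above (not the study's GRAMMAR mixed-model pipeline, which also corrects for relatedness), a minimal single-SNP allele-count test on invented genotypes might look like this:

```python
# Minimal sketch of a single-SNP case-control association test. The study itself
# used mixed-model methods (GRAMMAR), regional heritability mapping and haplotype
# sharing; this plain chi-square allele-count test only illustrates the basic
# genotype-phenotype comparison. All data below are invented.
import numpy as np
from scipy.stats import chi2_contingency

# Genotypes coded as the count of the minor allele (0, 1, 2) per animal.
genotypes = np.array([0, 1, 2, 1, 0, 2, 1, 1, 0, 2, 2, 1])
is_case   = np.array([0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 1, 1])  # 1 = skin test/lesion positive

# Allele-count contingency table: rows = case/control, columns = minor/major allele.
minor_case    = genotypes[is_case == 1].sum()
major_case    = 2 * (is_case == 1).sum() - minor_case
minor_control = genotypes[is_case == 0].sum()
major_control = 2 * (is_case == 0).sum() - minor_control
table = np.array([[minor_case, major_case], [minor_control, major_control]])

chi2, p, _, _ = chi2_contingency(table)
print(f"allelic chi-square = {chi2:.2f}, P = {p:.3g}")
```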

Relevance:

100.00%

Publisher:

Abstract:

Defining Simulation Intent involves capturing high level modelling and idealisation decisions in order to create an efficient and fit-for-purpose analysis. These decisions are recorded as attributes of the decomposed design space.

An approach to defining Simulation Intent is described utilising three known technologies: Cellular Modelling, the subdivision of space into volumes of simulation significance (structures, gas paths, internal and external airflows, etc.); Equivalencing, maintaining a consistent and coherent description of the equivalent representations of the spatial cells in different analysis models; and Virtual Topology, which offers tools for partitioning and de-partitioning the model without disturbing the manufacturing-oriented design geometry. The end result is a convenient framework to which high-level analysis attributes can be applied, and from which detailed analysis models can be generated with a high degree of controllability, repeatability and automation. There are multiple novel aspects to the approach, including its reusability, robustness to changes in model topology and the inherent links created between analysis models at different levels of fidelity and physics.

By utilising Simulation Intent, CAD modelling for simulation can be fully exploited and simulation workflows can be more readily automated, reducing many repetitive manual tasks (e.g. the definition of appropriate coupling between elements of different types and the application of boundary conditions). The approach has been implemented and tested with practical examples, and significant benefits are demonstrated.
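
A minimal sketch of how Simulation Intent attributes might be recorded against a cellular decomposition is given below. The class and attribute names are assumptions for illustration only, not the framework's actual data model.

```python
# Illustrative sketch only: one way to record Simulation Intent attributes against
# a cellular decomposition of the design space, with equivalences linking each cell
# to its representation in different analysis models. Names are invented.
from dataclasses import dataclass, field

@dataclass
class Cell:
    """A volume of simulation significance (structure, gas path, airflow, ...)."""
    name: str
    region_type: str                                  # e.g. "structure", "gas_path"
    attributes: dict = field(default_factory=dict)    # high-level analysis attributes

@dataclass
class Equivalence:
    """Links a cell to its representation in a particular analysis model."""
    cell: Cell
    model: str                                        # e.g. "thermal_2d", "structural_3d"
    representation: str                               # e.g. "shell_mesh", "lumped_mass"

# Attach high-level intent to cells once...
casing = Cell("casing", "structure", {"fidelity": "shell", "material": "Ti-6Al-4V"})
gas_path = Cell("main_gas_path", "gas_path", {"physics": "CFD", "target_y_plus": 1.0})

# ...and reuse it to generate different analysis models via equivalences.
links = [
    Equivalence(casing, "thermal_2d", "axisymmetric_section"),
    Equivalence(casing, "structural_3d", "shell_mesh"),
    Equivalence(gas_path, "thermal_2d", "1d_flow_network"),
]
for link in links:
    print(f"{link.model}: {link.cell.name} -> {link.representation} {link.cell.attributes}")
```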

Relevance:

100.00%

Publisher:

Abstract:

A parametric regression model for right-censored data with a log-linear median regression function and a transformation in both response and regression parts, named the parametric Transform-Both-Sides (TBS) model, is presented. The TBS model has a parameter that handles data asymmetry while allowing various distributions for the error, as long as they are unimodal symmetric distributions centered at zero. The discussion is focused on the estimation procedure with five important error distributions (normal, double-exponential, Student's t, Cauchy and logistic) and presents properties, associated functions (that is, survival and hazard functions) and estimation methods based on maximum likelihood and on the Bayesian paradigm. These procedures are implemented in TBSSurvival, an open-source fully documented R package. The use of the package is illustrated and the performance of the model is analyzed using both simulated and real data sets.
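
A hedged sketch of the underlying idea, maximum-likelihood estimation of a log-linear model from right-censored data, is shown below with simulated data and normal errors. The actual TBS model additionally transforms both the response and the regression parts and supports further error laws; the documented implementation is the TBSSurvival R package.

```python
# Sketch (not the TBS model itself): fit a log-linear survival regression with
# normal errors to right-censored data by maximising the censored log-likelihood.
# Events contribute the density, censored observations the survival function.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
log_t = 1.0 + 0.5 * x + 0.8 * rng.normal(size=n)     # true log survival times
c = rng.uniform(1.0, 4.0, size=n)                     # censoring times (log scale)
y = np.minimum(log_t, c)                              # observed log times
event = (log_t <= c).astype(float)                    # 1 = event, 0 = right-censored

def neg_loglik(params):
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)
    z = (y - (b0 + b1 * x)) / sigma
    ll = event * (norm.logpdf(z) - np.log(sigma)) + (1 - event) * norm.logsf(z)
    return -ll.sum()

fit = minimize(neg_loglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
b0, b1, log_sigma = fit.x
print(f"intercept = {b0:.2f}, slope = {b1:.2f}, sigma = {np.exp(log_sigma):.2f}")
```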

Relevance:

100.00%

Publisher:

Abstract:

This paper presents new results for the (partial) maximum a posteriori (MAP) problem in Bayesian networks, which is the problem of querying the most probable state configuration of some of the network variables given evidence. It is demonstrated that the problem remains hard even in networks with very simple topology, such as binary polytrees and simple trees (including the Naive Bayes structure), which extends previous complexity results. Furthermore, a Fully Polynomial Time Approximation Scheme for MAP in networks with bounded treewidth and bounded number of states per variable is developed. Approximation schemes were thought to be impossible, but here it is shown otherwise under the assumptions just mentioned, which are adopted in most applications.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents new results for the (partial) maximum a posteriori (MAP) problem in Bayesian networks, which is the problem of querying the most probable state configuration of some of the network variables given evidence. First, it is demonstrated that the problem remains hard even in networks with very simple topology, such as binary polytrees and simple trees (including the Naive Bayes structure). Such proofs extend previous complexity results for the problem. Inapproximability results are also derived in the case of trees if the number of states per variable is not bounded. Although the problem is shown to be hard and inapproximable even in very simple scenarios, a new exact algorithm is described that is empirically fast in networks of bounded treewidth and bounded number of states per variable. The same algorithm is used as the basis of a Fully Polynomial Time Approximation Scheme for MAP under such assumptions. Approximation schemes were generally thought to be impossible for this problem, but we show otherwise for classes of networks that are important in practice. The algorithms are extensively tested using some well-known networks as well as randomly generated cases to show their effectiveness.
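
A toy illustration of the (partial) MAP problem itself, solved by brute-force enumeration rather than by the algorithms developed in the paper, is sketched below for an invented three-variable chain network:

```python
# Toy illustration of (partial) MAP by brute-force enumeration: for a tiny chain
# A -> B -> C with evidence C = 1, maximise over the query variable A while
# summing out the latent variable B. All variables are binary and the
# probability tables below are invented.
p_a = {0: 0.6, 1: 0.4}                                     # P(A)
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}   # P(B | A)
p_c_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}   # P(C | B)

evidence_c = 1
scores = {}
for a in (0, 1):                  # MAP (query) variable
    total = 0.0
    for b in (0, 1):              # summed-out (non-query, non-evidence) variable
        total += p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][evidence_c]
    scores[a] = total

map_a = max(scores, key=scores.get)
print(f"P(A, C=1) by A: {scores}  ->  MAP assignment A={map_a}")
```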

Relevance:

100.00%

Publisher:

Abstract:

Artificial neural network (ANN) methods are used to predict forest characteristics. The data source is the Southeast Alaska (SEAK) Grid Inventory, a ground survey compiled by the USDA Forest Service at several thousand sites. The main objective of this article is to predict characteristics at unsurveyed locations between grid sites. A secondary objective is to evaluate the relative performance of different ANNs. Data from the grid sites are used to train six ANNs: multilayer perceptron, fuzzy ARTMAP, probabilistic, generalized regression, radial basis function, and learning vector quantization. A classification and regression tree method is used for comparison. Topographic variables are used to construct models: latitude and longitude coordinates, elevation, slope, and aspect. The models classify three forest characteristics: crown closure, species land cover, and tree size/structure. Models are constructed using n-fold cross-validation. Predictive accuracy is calculated using a method that accounts for the influence of misclassification as well as measuring correct classifications. The probabilistic and generalized regression networks are found to be the most accurate. The predictions of the ANN models are compared with a classification of the Tongass National Forest in southeast Alaska based on the interpretation of satellite imagery and are found to be of similar accuracy.
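
The general shape of such a comparison, training a neural network and a classification tree on topographic predictors and scoring both by n-fold cross-validation, is sketched below with invented data and scikit-learn; the study's other network types (fuzzy ARTMAP, probabilistic, generalized regression, RBF, LVQ) are not reproduced here.

```python
# Sketch with invented data: compare a multilayer perceptron and a classification
# tree on topographic predictors using 5-fold cross-validation.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300
# Hypothetical topographic predictors: latitude, longitude, elevation, slope, aspect.
X = np.column_stack([
    rng.uniform(55, 59, n), rng.uniform(-135, -130, n),
    rng.uniform(0, 900, n), rng.uniform(0, 45, n), rng.uniform(0, 360, n),
])
y = rng.integers(0, 3, n)   # placeholder labels, e.g. three crown-closure classes

models = {
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0)),
    "Classification tree": DecisionTreeClassifier(max_depth=5, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)   # n-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```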

Relevance:

100.00%

Publisher:

Abstract:

The Jutland peninsula in northern Denmark is home to the Limfjord, one of the largest estuarine bodies of water in the region. Human inhabitance of the Limfjord’s surrounding coastlines stretches back further than 7,800 cal BP, with anthropogenic influence on the landscape beginning approximately 6,000 cal BP. Understanding how the Limfjord as a system has changed throughout time is useful in comprehending subsistence patterns and anthropogenic influence. This research is part of a larger project aimed at discerning subsistence patterns and environmental change in the region. Following the Younger Dryas, as the Fennoscandian ice sheet began to melt, Denmark experienced isostatic rebound, which contributed to the complex sea level history in the region. Between ice melt and isostatic rebound, the Jutland peninsula experienced many transgression and regression events. Connections to surrounding seas have shifted throughout time, with most attention focused on the western connection of the Limfjord with the North Sea, which has experienced numerous closures and subsequent re-openings throughout the Holocene. Furthermore, the Limfjord-North Sea connection has been the focal point of research because of the west-to-east water flow in the system and the present-day higher salinity in the west compared to the east. Little to no consideration has been paid to the influence of the Kattegat and Baltic on the Limfjord until now. A 10 m sediment core was taken from Sebbersund (near Nibe, Limfjord), along the connection between the Limfjord and the Kattegat in the east, to understand how the eastern part of the system has changed and differed from changes observed in the west. The Sebbersund sequence spans a majority of the Holocene, from 9600 cal BP to 1030 cal BP, determined via radiocarbon dating of terrestrial macrofossils and bulk sediment. Over this time period, palaeoenvironmental conditions were reconstructed through the use of geochemical analyses (13C, 15N, C:N), physical sediment analyses, dinoflagellate cyst abundances and molluscan analyses. Apart from two instances of low salinity, one at the top and one at the bottom of the core, the sequence has a strong marine signal for a majority of the Holocene. Radiocarbon dating of bulk sediment samples showed the presence of old carbon in the system, creating an age offset of between 1,300 ± 200 and 2,800 ± 200 calibrated 14C years compared to the age-depth curve based on the terrestrial macrofossils. This finding, along with the strong marine influence in the system, discerned through geochemical data, dinoflagellate cyst and mollusc counts, is important for obtaining accurate radiocarbon ages in the region and stresses the importance of understanding both the marine and freshwater reservoir effects. The marine dominance in the eastern Limfjord differs from the west, which is characterized by a number of freshwater events when the North Sea connection was closed off during the Holocene. The eastern connection was open to the Kattegat throughout a large portion of the Holocene, with influx of open ocean water entering the system during periods of higher sea level.

Relevance:

100.00%

Publisher:

Abstract:

Blood culture contamination (BCC) has been associated with unnecessary antibiotic use, additional laboratory tests and increased length of hospital stay thus incurring significant extra hospital costs. We set out to assess the impact of a staff educational intervention programme on decreasing intensive care unit (ICU) BCC rates to <3% (American Society for Microbiology standard). BCC rates during the pre-intervention period (January 2006-May 2011) were compared with the intervention period (June 2011-December 2012) using run chart and regression analysis. Monthly ICU BCC rates during the intervention period were reduced to a mean of 3·7%, compared to 9·5% during the baseline period (P < 0·001) with an estimated potential annual cost savings of about £250 100. The approach used was simple in design, flexible in delivery and efficient in outcomes, and may encourage its translation into clinical practice in different healthcare settings.
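
A minimal sketch of the headline pre/post comparison, using invented culture counts and a simple chi-square test rather than the study's run-chart and regression analysis, could look like this:

```python
# Minimal sketch, with invented counts, of comparing blood culture contamination
# proportions before and after an intervention. The study itself tracked monthly
# ICU rates on a run chart and used regression analysis; this two-sample test only
# illustrates the headline pre/post comparison.
from scipy.stats import chi2_contingency

# [contaminated, not contaminated] blood cultures in each period (invented numbers).
pre_intervention  = [190, 1810]   # roughly 9.5% contamination
post_intervention = [74, 1926]    # roughly 3.7% contamination

chi2, p, _, _ = chi2_contingency([pre_intervention, post_intervention])
rate_pre = pre_intervention[0] / sum(pre_intervention)
rate_post = post_intervention[0] / sum(post_intervention)
print(f"pre = {rate_pre:.1%}, post = {rate_post:.1%}, P = {p:.3g}")
```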

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION:

It has been widely suggested that the prevalence of myopia is growing worldwide, and that the increases observed in East Asia, in particular, are sufficiently severe as to warrant the term "epidemic". Data in favour of a cohort effect in myopia prevalence are reviewed, with attention to significant shortcomings in the quality of available evidence. Additional factors contributing to myopia prevalence, including near work, genetics and socioeconomic status, are detailed.

MATERIALS AND METHODS:

Medline search of articles regarding myopia prevalence, trends and mechanisms.

RESULTS:

Age-related changes in myopia prevalence (increase during childhood, and regression in the fifth and sixth decades) are discussed as an alternative explanation for cross-sectional patterns in myopia prevalence. Only a handful of studies have examined the relative contributions of longitudinal changes in refraction over life and of birth cohort differences to age-specific myopia prevalence as measured in cross-sectional studies. Available data suggest that both longitudinal changes and cohort effects may be present, and that their relative contribution may differ in different racial groups.

CONCLUSIONS:

In view of the relatively weak evidence in favour of a large cohort effect for myopia in East Asia, and the even greater lack of evidence for increased prevalence of secondary ocular pathology, there appears to be inadequate support for large-scale interventions to prevent or delay myopia at the present time.

Relevance:

100.00%

Publisher:

Abstract:

This Integration Insight provides a brief overview of the most popular modelling techniques used to analyse complex real-world problems, as well as some less popular but highly relevant techniques. The modelling methods are divided into three categories, with each encompassing a number of methods, as follows: 1) Qualitative Aggregate Models (Soft Systems Methodology, Concept Maps and Mind Mapping, Scenario Planning, Causal (Loop) Diagrams); 2) Quantitative Aggregate Models (Function Fitting and Regression, Bayesian Nets, Systems of Differential Equations / Dynamical Systems, System Dynamics, Evolutionary Algorithms); and 3) Individual-Oriented Models (Cellular Automata, Microsimulation, Agent-Based Models, Discrete Event Simulation, Social Network Analysis). Each technique is broadly described with example uses, key attributes and reference material.
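
As a small, self-contained example of one of the quantitative aggregate methods listed above (function fitting and regression), the following sketch fits a logistic growth curve to invented observations:

```python
# Function fitting and regression example: fit a logistic growth curve to noisy,
# invented observations with scipy's curve_fit.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: carrying capacity K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 40)
y = logistic(t, K=100, r=1.2, t0=5) + rng.normal(scale=3, size=t.size)

params, _ = curve_fit(logistic, t, y, p0=[80, 1.0, 4.0])
print("fitted K, r, t0:", np.round(params, 2))
```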

Relevance:

100.00%

Publisher:

Abstract:

Power capping is a fundamental method for reducing the energy consumption of a wide range of modern computing environments, ranging from mobile embedded systems to datacentres. Unfortunately, maximising performance and system efficiency under static power caps remains challenging, while maximising performance under dynamic power caps has been largely unexplored. We present an adaptive power capping method that reduces the power consumption and maximises the performance of heterogeneous SoCs for mobile and server platforms. Our technique combines power capping with coordinated DVFS, data partitioning and core allocation on a heterogeneous SoC with ARM processors and FPGA resources. We design our framework as a run-time system based on OpenMP and OpenCL to utilise the heterogeneous resources. We evaluate it through five data-parallel benchmarks on the Xilinx SoC, which allows full voltage and frequency control. Our experiments show a significant performance boost of 30% under dynamic power caps with concurrent execution on ARM and FPGA, compared to a naive separate approach.
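
The core control idea, choosing an operating point so that power stays under a cap that can change at run time, can be sketched as below. The operating points and selection policy are invented for illustration; the paper's framework additionally coordinates data partitioning and ARM/FPGA core allocation through OpenMP and OpenCL.

```python
# Hedged sketch of adaptive power capping: a run-time loop that picks a DVFS
# operating point so that modelled power stays under a cap that may change over
# time. The numbers and the simple greedy policy are invented for illustration.

# DVFS operating points: (frequency in GHz, modelled power in watts).
OPERATING_POINTS = [(0.6, 1.5), (0.9, 2.5), (1.2, 4.0), (1.5, 6.0)]

def choose_operating_point(power_cap_w):
    """Highest-frequency point whose modelled power fits under the cap."""
    feasible = [op for op in OPERATING_POINTS if op[1] <= power_cap_w]
    return max(feasible) if feasible else min(OPERATING_POINTS)

# Dynamic power cap changing over time (e.g. thermally or battery driven).
for step, cap in enumerate([6.0, 6.0, 3.0, 2.0, 5.0]):
    freq, power = choose_operating_point(cap)
    print(f"step {step}: cap = {cap:.1f} W -> run at {freq:.1f} GHz ({power:.1f} W)")
```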

Relevance:

100.00%

Publisher:

Abstract:

In this paper we present a convolutional neural network (CNN)-based model for human head pose estimation in low-resolution multi-modal RGB-D data. We pose the problem as one of classification of human gazing direction. We further fine-tune a regressor based on the learned deep classifier. Next we combine the two models (classification and regression) to estimate approximate regression confidence. We present state-of-the-art results on datasets that span the range from high-resolution human-robot interaction data (close-up faces plus depth information) to challenging low-resolution outdoor surveillance data. We build upon our robust head-pose estimation and further introduce a new visual attention model to recover interaction with the environment. Using this probabilistic model, we show that many higher-level scene understanding tasks, such as human-human/scene interaction detection, can be achieved. Our solution runs in real time on commercial hardware.
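
One simple way to combine a discrete gaze-direction classifier with an angle regressor into an approximate confidence, in the spirit of the approach described above but with invented numbers rather than the paper's actual formulation, is sketched below:

```python
# Illustrative sketch (invented numbers, not the paper's model): the classifier
# outputs a softmax over discretised yaw bins; an approximate confidence for the
# regressed angle is taken as the probability mass of the bins closest to it.
import numpy as np

bin_centers_deg = np.array([-60, -30, 0, 30, 60])        # discretised yaw classes
class_probs = np.array([0.05, 0.15, 0.55, 0.20, 0.05])   # classifier softmax output
regressed_yaw = 8.0                                      # regressor's estimate (degrees)

# Weight each bin by how close it is to the regressed angle (within one bin width).
bin_width = 30.0
weights = np.clip(1.0 - np.abs(bin_centers_deg - regressed_yaw) / bin_width, 0.0, None)
confidence = float(np.sum(weights * class_probs))
print(f"regressed yaw = {regressed_yaw:.1f} deg, approximate confidence = {confidence:.2f}")
```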