941 results for Predictive

Relevance: 20.00%

Abstract:

This research evaluates pattern recognition techniques on a subclass of big data in which the dimensionality of the input space (p) is much larger than the number of observations (n). Specifically, we evaluate massive gene expression microarray cancer data where the ratio κ of observations to dimensions is less than one. We explore the statistical and computational challenges inherent in these high dimensional low sample size (HDLSS) problems and present statistical machine learning methods used to tackle and circumvent these difficulties. Regularization and kernel algorithms were explored in this research using seven datasets where κ < 1. These techniques require careful tuning, so several extensions of cross-validation were investigated to support better predictive performance. While no single algorithm was universally the best predictor, the regularization technique produced lower test errors in five of the seven datasets studied.
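The tuning problem described above can be sketched in a few lines: a ridge-regularized classifier fit to synthetic p >> n data, with k-fold cross-validation selecting the regularization strength. The dataset, the λ grid, and the estimator are illustrative stand-ins, not the study's methods or data.

```python
import numpy as np

# Illustrative HDLSS setup: far more features than observations (kappa = n/p < 1).
rng = np.random.default_rng(0)
n, p = 40, 500
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 2.0                       # only a few informative features
y = np.sign(X @ beta_true + rng.standard_normal(n))

def ridge_fit(X, y, lam):
    # Closed-form ridge solution; lam > 0 keeps X'X + lam*I invertible when p > n
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def cv_error(X, y, lam, k=5):
    """k-fold cross-validated misclassification rate for a given lambda."""
    folds = np.array_split(np.arange(len(y)), k)
    errs = []
    for f in folds:
        tr = np.setdiff1d(np.arange(len(y)), f)
        w = ridge_fit(X[tr], y[tr], lam)
        errs.append(np.mean(np.sign(X[f] @ w) != y[f]))
    return float(np.mean(errs))

lams = [0.1, 1.0, 10.0, 100.0]
best = min(lams, key=lambda lam: cv_error(X, y, lam))
```

Because p > n, the unregularized least-squares problem is ill-posed; the ridge penalty is what makes the fit well-defined at all, which is why its tuning matters so much in HDLSS settings.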

Relevance: 20.00%

Abstract:

A novel simulation model for pyrolysis processes of lignocellulosic biomass in Aspen Plus® was presented at the BC&E 2013. Based on kinetic reaction mechanisms, the simulation calculates product compositions and yields depending on reactor conditions (temperature, residence time, flue gas flow rate) and feedstock composition (biochemical composition, atomic composition, ash and alkali metal content). The simulation model was found to show good correlation with existing publications. In order to further verify the model, we perform our own pyrolysis experiments in a 1 kg/h continuously fed fluidized bed fast pyrolysis reactor. Two types of biomass with different characteristics are processed in order to evaluate the influence of the feedstock composition on the yields of the pyrolysis products and their composition: one wood and one straw-like feedstock are used because of their differing characteristics. Furthermore, the temperature response of yields and product compositions is evaluated by varying the reactor temperature between 450 °C and 550 °C for one of the feedstocks. The yields of the pyrolysis products (gas, oil, char) are determined and their detailed composition is analysed. The experimental runs are reproduced with the corresponding reactor conditions in the Aspen Plus model and the results compared with the experimental findings.
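A minimal sketch of the kinetic idea behind such a model: competing first-order Arrhenius reactions converting biomass into oil, gas and char, with yields depending on reactor temperature and residence time. All rate parameters below are made-up placeholders for illustration, not values from the Aspen Plus model or the experiments.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol K)

def k_arrhenius(A, Ea, T):
    """First-order rate constant at absolute temperature T (K)."""
    return A * np.exp(-Ea / (R * T))

def product_yields(T_celsius, t_res=2.0):
    """Mass fractions of oil, gas and char after residence time t_res (s).

    Parallel first-order channels from one reactant: each product gets a
    share k_i / k_tot of the converted biomass. Kinetic parameters
    (A in 1/s, Ea in J/mol) are illustrative placeholders only.
    """
    T = T_celsius + 273.15
    k_oil = k_arrhenius(1.0e7, 1.12e5, T)
    k_gas = k_arrhenius(4.0e6, 1.20e5, T)
    k_char = k_arrhenius(1.0e2, 6.00e4, T)
    k_tot = k_oil + k_gas + k_char
    converted = 1.0 - np.exp(-k_tot * t_res)  # fraction of biomass decomposed
    return {name: float(converted * k / k_tot)
            for name, k in [("oil", k_oil), ("gas", k_gas), ("char", k_char)]}

y450 = product_yields(450.0)
y550 = product_yields(550.0)
```

Because the gas channel here has the highest activation energy, its share of the products grows as the reactor temperature rises from 450 °C to 550 °C, which is the qualitative temperature response such a simulation explores.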

Relevance: 20.00%

Abstract:

Purpose – The purpose of this paper is to examine challenges and potential of big data in heterogeneous business networks and relate these to an implemented logistics solution.

Design/methodology/approach – The paper establishes an overview of challenges and opportunities of current significance in the area of big data, specifically in the context of transparency and processes in heterogeneous enterprise networks. Within this context, the paper presents how existing components and purpose-driven research were combined for a solution implemented in a nationwide network for less-than-truckload consignments.

Findings – Aside from providing an extended overview of today's big data situation, the findings have shown that technical means and methods available today can comprise a feasible process transparency solution in a large heterogeneous network where legacy practices, reporting lags and incomplete data exist, yet processes are sensitive to inadequate policy changes.

Practical implications – The means introduced in the paper were found to be of utility value in improving process efficiency, transparency and planning in logistics networks. The particular system design choices in the presented solution allow an incremental introduction or evolution of resource handling practices, incorporating existing fragmentary, unstructured or tacit knowledge of experienced personnel into the theoretically founded overall concept.

Originality/value – The paper extends previous high-level views on the potential of big data, and presents new applied research and development results in a logistics application.

Relevance: 20.00%

Abstract:

After the introduction of the Basel 2 capital accord, banks and lenders in Hungary also began to build up their own internal rating systems, whose maintenance and development are a continuing task. The author explores whether it is possible to increase the predictive power of business-failure forecasting models using traditional mathematical and statistical methods by incorporating into the models the degree of change in financial indicators over time. The empirical findings suggest that the temporal development of the financial indicators of Hungarian firms carries important information about their future ability to pay, since using such indicators significantly increases the predictive power of bankruptcy models. The author also examines whether correcting extremely high or low observation values before modelling improves the classification performance of the models.
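The two ideas examined in the paper can be illustrated with a short feature-construction sketch: (1) augmenting level ratios with their change over time, and (2) winsorising extreme values before modelling. The synthetic data, the percentile thresholds, and the three-ratio layout are all assumptions for illustration, not the author's dataset or specification.

```python
import numpy as np

# Synthetic stand-in for firm-level financial ratios in two consecutive years
# (e.g. liquidity, leverage, return on assets) -- purely illustrative.
rng = np.random.default_rng(1)
ratios_t0 = rng.normal(1.0, 0.3, size=(200, 3))
ratios_t1 = ratios_t0 + rng.normal(0.0, 0.1, size=(200, 3))

def winsorize(x, lo=1.0, hi=99.0):
    """Clip each column at the given percentiles to tame extreme values."""
    lo_v, hi_v = np.percentile(x, [lo, hi], axis=0)
    return np.clip(x, lo_v, hi_v)

# Feature matrix for a bankruptcy model: current levels plus the
# year-over-year change in each ratio, both winsorised before modelling.
X = np.hstack([winsorize(ratios_t1), winsorize(ratios_t1 - ratios_t0)])
```

The second half of `X` carries the temporal information whose value the paper tests: a model fit on all six columns can exploit deterioration or improvement in a firm's ratios, not just their levels.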

Relevance: 20.00%

Abstract:

Homework has been a controversial issue in education for the past century. Research has been scarce and has yielded results at both ends of the spectrum. This study examined the relationship between homework performance (percent of homework completed and percent of homework correct), student characteristics (SAT-9 score, gender, ethnicity, and socio-economic status), perceptions, and challenges and academic achievement as determined by the students' average score on weekly tests and their score on the FCAT NRT mathematics assessment.

The subjects for this study consisted of 143 students enrolled in Grade 3 at a suburban elementary school in Miami, Florida. Pearson's correlations were used to examine the associations of the predictor variables with average test scores and FCAT NRT scores. Additionally, simultaneous regression analyses were carried out to examine the influence of the predictor variables on each of the criterion variables, and hierarchical regression analyses were performed on the criterion variables from the predictor variables.

Homework performance was significantly correlated with average test score. Controlling for the other variables, homework performance was highly related to average test score and FCAT NRT score.

This study lends support to the view that homework completion is highly related to student academic achievement at the lower elementary level. It is suggested that, at the elementary level, more consideration be given to the amount of homework completed by students and that this information be utilized in formulating intervention strategies for students who may not be achieving at appropriate levels.
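The hierarchical-regression logic used in the study can be sketched as follows: fit a base model with student characteristics, then add homework performance and compare the variance explained. All data here are synthetic, and the variable names only mirror the study's constructs.

```python
import numpy as np

# Synthetic stand-ins for the study's variables (n = 143 third-graders).
rng = np.random.default_rng(2)
n = 143
sat9 = rng.normal(600.0, 50.0, n)                 # standardized test score
ses = rng.integers(0, 2, n).astype(float)         # socio-economic indicator
hw_completed = rng.uniform(0.4, 1.0, n)           # percent homework completed
test_score = 0.05 * sat9 + 30.0 * hw_completed + rng.normal(0.0, 3.0, n)

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

# Step 1: student characteristics only; Step 2: add homework performance.
r2_base = r_squared(np.column_stack([sat9, ses]), test_score)
r2_full = r_squared(np.column_stack([sat9, ses, hw_completed]), test_score)
delta_r2 = r2_full - r2_base   # incremental variance explained by homework
```

The quantity `delta_r2` is the point of a hierarchical design: it isolates how much homework performance predicts achievement after controlling for the characteristics entered in the first step.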

Relevance: 20.00%

Abstract:

Groundwater systems of different densities are often mathematically modeled to understand and predict environmental behavior such as seawater intrusion or submarine groundwater discharge. Additional data collection may be justified if it will cost-effectively aid in reducing the uncertainty of a model's prediction. The collection of salinity as well as temperature data could aid in reducing predictive uncertainty in a variable-density model. However, before numerical models can be created, rigorous testing of the modeling code needs to be completed. This research documents the benchmark testing of a new modeling code, SEAWAT Version 4. The benchmark problems include various combinations of density-dependent flow resulting from variations in concentration and temperature. The verified code, SEAWAT, was then applied to two different hydrological analyses to explore the capacity of a variable-density model to guide data collection.

The first analysis tested a linear method to guide data collection by quantifying the contribution of different data types and locations toward reducing predictive uncertainty in a nonlinear variable-density flow and transport model. The relative contributions of temperature and concentration measurements, at different locations within a simulated carbonate platform, for predicting movement of the saltwater interface were assessed. Results from the method showed that concentration data had greater worth than temperature data in reducing predictive uncertainty in this case. Results also indicated that a linear method could be used to quantify data worth in a nonlinear model.

The second hydrological analysis utilized a model to identify the transient response of the salinity, temperature, age, and amount of submarine groundwater discharge to changes in tidal ocean stage, seasonal temperature variations, and different types of geology. The model was compared to multiple kinds of data to (1) calibrate and verify the model, and (2) explore the potential for the model to be used to guide the collection of data using techniques such as electromagnetic resistivity, thermal imagery, and seepage meters. Results indicated that the model can be used to give insight into submarine groundwater discharge and to guide data collection.
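The linear data-worth idea in the first analysis can be sketched with a first-order (FOSM-style) calculation: given sensitivities of the observations and of the prediction to the model parameters, compare the prediction variance before and after notionally assimilating a candidate measurement. The matrices below are tiny made-up stand-ins, not quantities from the SEAWAT model; the numbers are chosen only so the example echoes the paper's qualitative finding.

```python
import numpy as np

def predictive_variance(C_p, j_pred, J_obs=None, C_eps=None):
    """First-order prediction variance j' C_p j, optionally after
    conditioning the parameter covariance C_p on candidate observations
    with sensitivity matrix J_obs and noise covariance C_eps."""
    if J_obs is not None:
        S = J_obs @ C_p @ J_obs.T + C_eps
        C_p = C_p - C_p @ J_obs.T @ np.linalg.solve(S, J_obs @ C_p)
    return float(j_pred @ C_p @ j_pred)

C_p = np.diag([1.0, 0.5, 0.2])        # prior parameter covariance (placeholder)
j_pred = np.array([0.8, 0.3, 0.1])    # prediction sensitivity, e.g. interface position
J_conc = np.array([[0.9, 0.2, 0.0]])  # a candidate concentration measurement
J_temp = np.array([[0.2, 0.1, 0.4]])  # a candidate temperature measurement
noise = np.array([[0.01]])            # measurement noise covariance

var_prior = predictive_variance(C_p, j_pred)
worth_conc = var_prior - predictive_variance(C_p, j_pred, J_conc, noise)
worth_temp = var_prior - predictive_variance(C_p, j_pred, J_temp, noise)
```

"Data worth" here is simply the drop in prediction variance a measurement would buy; ranking candidate data types and locations by this quantity is what lets a linear method guide collection even for a nonlinear model.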

Relevance: 20.00%

Abstract:

This study examined the relationship between homework performance (percent of homework completed and percent of homework correct), student characteristics (Stanford Achievement Test score, gender, ethnicity, and socio-economic status), perceptions, and challenges and academic achievement, as determined by the students' average score on weekly tests and their score on the Florida Comprehensive Assessment Test (FCAT) Norm Reference Test (NRT) mathematics assessment.

Relevance: 20.00%

Abstract:

Corrigendum: European Journal of Human Genetics (2016) 24, 1515; doi:10.1038/ejhg.2016.81

22 Years of predictive testing for Huntington's disease: the experience of the UK Huntington's Prediction Consortium

Sheharyar S Baig, Mark Strong, Elisabeth Rosser, Nicola V Taverner, Ruth Glew, Zosia Miedzybrodzka, Angus Clarke, David Craufurd, UK Huntington's Disease Prediction Consortium and Oliver W Quarrell

Correction to: European Journal of Human Genetics advance online publication, 11 May 2016; doi:10.1038/ejhg.2016.36

After online publication the authors realised that they had made an error. The sentence on page 2, 'In the first 5-year period........but this changed significantly in the last 5-year period with 51% positive and 49% negative (χ2=20.6, P<0.0001)', should read: 'In the first 5-year period........but this changed significantly in the last 5-year period with 49% positive and 51% negative (χ2=20.6, P<0.0001)'.

Relevance: 20.00%

Abstract:

As we look around a scene, we perceive it as continuous and stable even though each saccadic eye movement changes the visual input to the retinas. How the brain achieves this perceptual stabilization is unknown, but a major hypothesis is that it relies on presaccadic remapping, a process in which neurons shift their visual sensitivity to a new location in the scene just before each saccade. This hypothesis is difficult to test in vivo because complete, selective inactivation of remapping is currently intractable. We tested it in silico with a hierarchical, sheet-based neural network model of the visual and oculomotor system. The model generated saccadic commands to move a video camera abruptly. Visual input from the camera and internal copies of the saccadic movement commands, or corollary discharge, converged at a map-level simulation of the frontal eye field (FEF), a primate brain area known to receive such inputs. FEF output was combined with eye position signals to yield a suitable coordinate frame for guiding arm movements of a robot. Our operational definition of perceptual stability was "useful stability," quantified as continuously accurate pointing to a visual object despite camera saccades. During training, the emergence of useful stability was correlated tightly with the emergence of presaccadic remapping in the FEF. Remapping depended on corollary discharge but its timing was synchronized to the updating of eye position. When coupled to predictive eye position signals, remapping served to stabilize the target representation for continuously accurate pointing. Graded inactivations of pathways in the model replicated, and helped to interpret, previous in vivo experiments. The results support the hypothesis that visual stability requires presaccadic remapping, provide explanations for the function and timing of remapping, and offer testable hypotheses for in vivo studies. 
We conclude that remapping allows for seamless coordinate frame transformations and quick actions despite visual afferent lags. With visual remapping in place for behavior, it may be exploited for perceptual continuity.
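The core remapping mechanism described above can be illustrated with a toy map-level sketch (not the paper's sheet-based network): a population of visually responsive units shifts its activity by the corollary-discharge copy of the upcoming saccade vector, before the eye actually moves, so the post-saccadic input is predicted in advance. Map size, target location, and saccade vector are arbitrary illustrative choices.

```python
import numpy as np

def gaussian_map(center, size=41, sigma=2.0):
    """Population activity on a retinotopic sheet, peaked at `center`."""
    x = np.arange(size)
    xx, yy = np.meshgrid(x, x, indexing="ij")
    return np.exp(-((xx - center[0])**2 + (yy - center[1])**2) / (2 * sigma**2))

def remap(activity, saccade_vector):
    """Presaccadic remapping: shift activity opposite the saccade,
    anticipating how the retinal image will move when the eye does."""
    return np.roll(activity,
                   shift=(-saccade_vector[0], -saccade_vector[1]),
                   axis=(0, 1))

target = (25, 20)
saccade = (5, 3)                    # corollary discharge of the planned saccade
before = gaussian_map(target)
predicted = remap(before, saccade)  # prediction issued before the eye moves
# Activity the map would actually show after the saccade lands:
after = gaussian_map((target[0] - saccade[0], target[1] - saccade[1]))
```

Because `predicted` already matches `after` at saccade onset, a downstream readout (the model's pointing robot) never sees a transient mismatch, which is the "useful stability" the study quantifies.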

Relevance: 20.00%

Abstract:

BACKGROUND: It is unclear whether diagnostic protocols based on cardiac markers to identify low-risk chest pain patients suitable for early release from the emergency department can be applied to patients older than 65 years or with traditional cardiac risk factors. METHODS AND RESULTS: In a single-center retrospective study of 231 consecutive patients with high-risk factor burden in which a first cardiac troponin (cTn) level was measured in the emergency department and a second cTn sample was drawn 4 to 14 hours later, we compared the performance of a modified 2-Hour Accelerated Diagnostic Protocol to Assess Patients with Chest Pain Using Contemporary Troponins as the Only Biomarker (ADAPT) rule to a new risk classification scheme that identifies patients as low risk if they have no known coronary artery disease, a nonischemic electrocardiogram, and 2 cTn levels below the assay's limit of detection. Demographic and outcome data were abstracted through chart review. The median age of our population was 64 years, and 75% had a Thrombolysis In Myocardial Infarction risk score ≥2. Using our risk classification rule, 53 (23%) patients were low risk with a negative predictive value for 30-day cardiac events of 98%. Applying a modified ADAPT rule to our cohort, 18 (8%) patients were identified as low risk with a negative predictive value of 100%. In a sensitivity analysis, the negative predictive value of our risk algorithm did not change when we relied only on undetectable baseline cTn and eliminated the second cTn assessment. CONCLUSIONS: If confirmed in prospective studies, this less-restrictive risk classification strategy could be used to safely identify chest pain patients with more traditional cardiac risk factors for early emergency department release.
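The study's risk classification rule is simple enough to state as a function: a patient is low risk only with no known coronary artery disease, a nonischemic ECG, and two troponin values below the assay's limit of detection. The numeric LOD below is a placeholder, not the study's assay threshold.

```python
# Hypothetical limit of detection in ng/L -- assay-dependent, illustrative only.
TROPONIN_LOD_NG_L = 5.0

def is_low_risk(known_cad, ischemic_ecg, ctn_first, ctn_second):
    """Return True when all three low-risk criteria described above are met:
    no known coronary artery disease, a nonischemic electrocardiogram,
    and both troponin draws below the assay's limit of detection."""
    troponins_undetectable = (ctn_first < TROPONIN_LOD_NG_L
                              and ctn_second < TROPONIN_LOD_NG_L)
    return (not known_cad) and (not ischemic_ecg) and troponins_undetectable
```

The sensitivity analysis in the abstract amounts to dropping the `ctn_second` condition and relying on the first undetectable draw alone.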

Relevance: 20.00%

Abstract:

We analyzed projections of current and future ambient temperatures along the eastern United States in relationship to the thermal tolerance of harbor seals in air. Using the earth systems model (HadGEM2-ES) and representative concentration pathways (RCPs) 4.5 and 8.5, which are indicative of two different atmospheric CO2 concentrations, we were able to examine possible shifts in distribution based on three metrics: current preferences, the thermal limit of juveniles, and the thermal limits of adults. Our analysis focused on average ambient temperatures because harbor seals are least effective at regulating their body temperature in air, making them most susceptible to rising air temperatures in the coming years. Our study focused on the months of May, June, and August from 2041-2060 (2050) and 2061-2080 (2070) as these are the historic months in which harbor seals are known to annually come ashore to pup, breed, and molt. May, June, and August are also some of the warmest months of the year. We found that breeding colonies along the eastern United States will be limited by the thermal tolerance of juvenile harbor seals in air, while their foraging range will extend as far south as the thermal tolerance of adult harbor seals in air. Our analysis revealed that in 2070, harbor seal pups should be absent from the United States coastline nearing the end of the summer due to exceptionally high air temperatures.

Relevance: 20.00%

Abstract:

If patients at risk of admission or readmission to hospital or other forms of care could be identified and offered suitable early interventions, then their lives and long-term health may be improved by reducing the chances of future admission or readmission to care and, hopefully, their cost of care reduced. Considerable work has been carried out in this subject area, especially in the USA and the UK. This has led, for instance, to the development of tools such as PARR, PARR-30, and the Combined Predictive Model for prediction of emergency readmission or admission to acute care. Here we perform a structured review of the academic and grey literature on predictive risk tools for social care utilisation, as well as admission and readmission to general hospitals and psychiatric hospitals. This is the first phase of a project in partnership with Docobo Ltd and funded by Innovate UK, in which we seek to develop novel predictive risk tools and dashboards to assist commissioners in Clinical Commissioning Groups with the triangulation of the intelligence available from routinely collected data to optimise integrated care and better understand the complex needs of individuals.

Relevance: 20.00%

Abstract:

Measuring and tracking athletic performance is crucial to an athlete's development, and the countermovement vertical jump is often used to measure athletic performance, particularly lower limb power. The linear power developed in the lower limb is estimated through jump height. However, the relationship between angular power, produced by the joints of the lower limb, and jump height is not well understood. This study examined the contributions to jump height in the countermovement vertical jump of the kinetic value of angular power, and of its kinematic component, angular velocity, at the lower limb joints. Kinematic and kinetic data were gathered from twenty varsity-level basketball and volleyball athletes as they performed six maximal effort jumps in four arm swing conditions: no-arm involvement, single-non-dominant arm swing, single-dominant arm swing, and two-arm swing. The displacement of the whole body centre of mass, peak joint powers, peak angular velocities, and the locations of the peaks as a percentage of the jump's takeoff period were computed. Linear regressions assessed the relationship of the variables to jump height. Results demonstrated that peak knee power (p = 0.001, β = 0.363, r = 0.363), its location within the takeoff period (p = 0.023, β = -0.256, r = 0.256), and peak knee angular velocity (p = 0.005, β = 0.310, r = 0.310) were moderately linked to increased jump height. Additionally, the locations, within the takeoff period, of the peak angular velocities of the hip (p = 0.003, β = -0.318, r = 0.419) and ankle (p = 0.011, β = 0.270, r = 0.419) were significant predictors of jump height. These results highlight the importance of training the velocity and timing of joint motion beyond traditional power training protocols, as well as the importance of further investigation into appropriate testing protocols that are sensitive to the contributions of individual joints in maximal effort jumping.
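The outcome variable above, jump height from the rise of the whole-body centre of mass, follows from projectile motion of the centre of mass after takeoff: h = v² / 2g. A minimal sketch, with an illustrative takeoff velocity rather than any value measured in the study:

```python
G = 9.81  # gravitational acceleration, m/s^2

def jump_height(takeoff_velocity):
    """Rise of the centre of mass after takeoff (m), from h = v^2 / 2g.

    Assumes the vertical velocity of the whole-body centre of mass at
    takeoff is known (e.g. from force-plate impulse or motion capture).
    """
    return takeoff_velocity**2 / (2 * G)

h = jump_height(2.8)  # illustrative takeoff velocity of 2.8 m/s
```

This is why the regressions target jump height: any joint-level variable (peak power, peak angular velocity, or its timing) can only raise the jump by raising the centre-of-mass velocity at the instant of takeoff.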