787 results for Real world data
Abstract:
In this paper, we discuss some practical implications of implementing adaptable network algorithms for non-stationary time series problems. Two real-world data sets, containing electricity load demands and foreign exchange market prices, are used to test several different methods, ranging from linear models with fixed parameters to non-linear models that adapt both parameters and model order on-line. Training with the extended Kalman filter, we demonstrate that the dynamic model-order increment procedure of the resource-allocating RBF network (RAN) is highly sensitive to the parameters of the novelty criterion. We investigate the use of system noise for increasing the plasticity of the Kalman filter training algorithm, and discuss the consequences for on-line model order selection. The results of our experiments show that there are advantages to be gained in tracking real-world non-stationary data through the use of more complex adaptive models.
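The RAN novelty criterion referred to above allocates a new radial basis unit only when an input is both far from every existing centre and poorly predicted. A minimal sketch of that check (illustrative names; epsilon and delta are the sensitive novelty-criterion parameters the abstract discusses):

```python
import numpy as np

def should_add_unit(x, y, centers, predict, epsilon, delta):
    """RAN-style novelty check (a sketch, not the authors' code):
    allocate a new RBF unit only when the input lies farther than
    epsilon from every existing centre AND the absolute prediction
    error exceeds delta."""
    if not centers:
        return True  # no units yet: the first input is always novel
    nearest = min(np.linalg.norm(x - c) for c in centers)
    error = abs(y - predict(x))
    return nearest > epsilon and error > delta
```

Because model growth is gated entirely by epsilon and delta, small changes to either threshold alter how quickly the model order increases, which is the sensitivity reported above.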
Abstract:
The aim of the thesis is to assess the impact of depression in people with type 2 diabetes. Using Healthcare Utilization Databases, I estimated, in a large population-based cohort with type 2 diabetes, the incidence of depression over a 10-year period, identified the demographic and clinical predictors of depression, and determined the extent to which depression is a risk factor for acute and long-term complications and mortality. In the context of the COVID-19 pandemic, I evaluated whether a history of depression in type 2 diabetes increased the Emergency Department (ED) access rate for diabetes-related complications, and I investigated changes in the incidence of depression during the first year of the pandemic. Findings from the first study indicated that developing depression was associated with being a woman, being over 65 years of age, living in rural areas, having insulin as the initial diabetes medication, and having comorbid conditions; the study also confirmed that depression was associated with an increased risk of acute and long-term diabetes complications and all-cause mortality. The second observational study showed a higher rate of ED access for diabetes-related complications during the pandemic in people with type 2 diabetes and a history of depression than in those without a history of depression, similar to what was observed in a pre-pandemic period. As shown in the third population-based study, the incidence of depression decreased in 2020 compared to 2019, mainly during the first and second waves of the COVID-19 pandemic, when people probably had difficulty reaching healthcare services. This new real-world evidence will help healthcare professionals promptly identify patients at high risk of developing depression. Lastly, policymakers and physicians will benefit from new evidence on the effects of the COVID-19 pandemic on depression in people with type 2 diabetes, helping to ensure a high level of care during crisis periods.
Abstract:
The Multiple Pheromone Ant Clustering Algorithm (MPACA) models the collective behaviour of ants to find clusters in data and to assign objects to the most appropriate class. It is an ant colony optimisation approach that uses pheromones to mark paths linking objects that are similar and potentially members of the same cluster or class. Its novelty is in the way it uses separate pheromones for each descriptive attribute of the object rather than a single pheromone representing the whole object. Ants that encounter other ants frequently enough can combine the attribute values they are detecting, which enables the MPACA to learn influential variable interactions. This paper applies the model to real-world data from two domains. One is logistics, focusing on resource allocation rather than the more traditional vehicle-routing problem. The other is mental-health risk assessment. The task for the MPACA in each domain was to predict class membership where the classes for the logistics domain were the levels of demand on haulage company resources and the mental-health classes were levels of suicide risk. Results on these noisy real-world data were promising, demonstrating the ability of the MPACA to find patterns in the data with accuracy comparable to more traditional linear regression models. © 2013 Polish Information Processing Society.
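The algorithm's stated novelty is one pheromone trail per descriptive attribute rather than a single trail per object. A hypothetical sketch of such a per-attribute deposit and evaporation step (the names and the similarity rule are assumptions for illustration, not the published MPACA update):

```python
import numpy as np

def deposit(pheromone, i, j, obj_i, obj_j, thresholds, amount=1.0):
    """pheromone has shape (n_objects, n_objects, n_attributes): a
    separate trail per attribute on each edge between two objects.
    Pheromone is deposited only on attributes where the two objects
    are similar."""
    for a, (v_i, v_j) in enumerate(zip(obj_i, obj_j)):
        if abs(v_i - v_j) <= thresholds[a]:
            pheromone[i, j, a] += amount
            pheromone[j, i, a] += amount

def evaporate(pheromone, rho=0.1):
    """Standard ant-colony evaporation applied to every attribute trail."""
    pheromone *= (1.0 - rho)

# Example: 5 objects described by 3 attributes.
pheromone = np.zeros((5, 5, 3))
```

Keeping a trail per attribute is what lets ants that meet frequently combine the attribute values they are detecting, and hence learn variable interactions.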
Abstract:
Objectives: Previous research conducted in the late 1980s suggested that vehicle impacts following an initial barrier collision increase severe occupant injury risk. Now over 25 years old, the data are no longer representative of the currently installed barriers or the present US vehicle fleet. The purpose of this study is to provide a present-day assessment of secondary collisions and to determine whether current full-scale barrier crash testing criteria provide an indication of secondary collision risk for real-world barrier crashes. Methods: To characterize secondary collisions, 1,363 (596,331 weighted) real-world barrier midsection impacts selected from 13 years (1997-2009) of in-depth crash data available through the National Automotive Sampling System (NASS) / Crashworthiness Data System (CDS) were analyzed. Scene diagrams and available scene photographs were used to determine roadside and barrier-specific variables unavailable in NASS/CDS. Binary logistic regression models were developed for second event occurrence and resulting driver injury. To investigate current secondary collision crash test criteria, 24 full-scale crash test reports were obtained for common non-proprietary US barriers, and the risk of secondary collisions was determined using recommended evaluation criteria from National Cooperative Highway Research Program (NCHRP) Report 350. Results: Secondary collisions were found to occur in approximately two-thirds of crashes where a barrier is the first object struck. Barrier lateral stiffness, post-impact vehicle trajectory, vehicle type, and pre-impact tracking conditions were found to be statistically significant contributors to secondary event occurrence. The presence of a second event was found to increase the likelihood of a serious driver injury by a factor of 7 compared to cases with no second event present. The NCHRP Report 350 exit angle criterion was found to underestimate the risk of secondary collisions in real-world barrier crashes. Conclusions: Consistent with previous research, collisions following a barrier impact are not an infrequent event and substantially increase driver injury risk. The results suggest that using exit-angle-based crash test criteria alone to assess secondary collision risk is not sufficient to predict second collision occurrence for real-world barrier crashes.
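The binary logistic regression described in the Methods can be reproduced in outline with standard tools. The sketch below uses hypothetical file and column names (the actual NASS/CDS variables differ, and categorical predictors are assumed already coded numerically), applying the national case weights as frequency weights:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical file and column names, for illustration only.
df = pd.read_csv("barrier_cases.csv")
X = sm.add_constant(df[["barrier_stiffness", "vehicle_type", "tracking"]])
model = sm.GLM(df["second_event"], X,
               family=sm.families.Binomial(),
               freq_weights=df["case_weight"])  # NASS sampling weights
result = model.fit()
print(result.summary())
print(np.exp(result.params))  # coefficients as odds ratios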
Abstract:
This study describes the pedagogical impact of real-world experimental projects undertaken as part of an advanced undergraduate Fluid Mechanics subject at an Australian university. The projects have been organised to complement traditional lectures and introduce students to the challenges of professional design, physical modelling, data collection and analysis. The physical model studies combine experimental, analytical and numerical work in order to develop students' abilities to tackle real-world problems. A first study illustrates the differences between ideal- and real-fluid flow force predictions based upon model tests of buildings in a large wind tunnel used for research and professional testing. A second study introduces the complexity arising from unsteady non-uniform wave loading on a sheltered pile. The teaching initiative is supported by feedback from undergraduate students. The pedagogy of the course and projects is discussed with reference to experiential, project-based and collaborative learning. The practical work complements traditional lectures and tutorials, and provides opportunities that cannot be replicated in the classroom, real or virtual. Student feedback demonstrates strong interest in the project phases of the course. This was associated with greater motivation for the course, leading in turn to lower failure rates. In terms of learning outcomes, the primary aim is to enable students to deliver a professional report as the final product, in which physical model data are compared to ideal-fluid flow calculations and real-fluid flow analyses. Thus the students are exposed to a professional design approach involving a high level of expertise in fluid mechanics, with sufficient academic guidance to achieve carefully defined learning goals, while retaining sufficient flexibility for students to construct their own learning goals. The overall pedagogy is a blend of problem-based and project-based learning, which reflects academic research and professional practice. The assessment is a mix of peer-assessed oral presentations and written reports that aims to maximise student reflection and development. Student feedback indicated a strong motivation for courses that include a well-designed project component.
Abstract:
AIM: This work presents detailed experimental performance results from tests executed in the hospital environment for Health Monitoring for All (HM4All), a remote vital signs monitoring system based on a ZigBee® (ZigBee Alliance, San Ramon, CA) body sensor network (BSN). MATERIALS AND METHODS: Tests involved the use of six electrocardiogram (ECG) sensors operating in two different modes: the ECG mode involved the transmission of ECG waveform data and heart rate (HR) values to the ZigBee coordinator, whereas the HR mode included only the transmission of HR values. In the absence of hidden nodes, a non-beacon-enabled star network composed of sensing devices working on ECG mode kept the delivery ratio (DR) at 100%. RESULTS: When the network topology was changed to a 2-hop tree, the performance degraded slightly, resulting in an average DR of 98.56%. Although these performance outcomes may seem satisfactory, further investigation demonstrated that individual sensing devices went through transitory periods with low DR. Other tests have shown that ZigBee BSNs are highly susceptible to collisions owing to hidden nodes. Nevertheless, these tests have also shown that these networks can achieve high reliability if the amount of traffic is kept low. Contrary to what is typically shown in scientific articles and in manufacturers' documentation, the test outcomes presented in this article include temporal graphs of the DR achieved by each wireless sensor device. CONCLUSIONS: The test procedure and the approach used to represent its outcomes, which allow the identification of undesirable transitory periods of low reliability due to contention between devices, constitute the main contribution of this work.
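The temporal delivery-ratio graphs the authors highlight can be produced from a per-packet log. A sketch under assumed log fields (device_id, timestamp, and a received flag, none of which are specified in the abstract):

```python
import pandas as pd

# Assumed log format: one row per packet sent by a sensing device,
# with a boolean flag marking whether the coordinator received it.
log = pd.read_csv("packets.csv", parse_dates=["timestamp"])

# Per-device delivery ratio (%) in 10-second windows: the kind of
# time series plotted in the article's temporal DR graphs.
dr = (log.set_index("timestamp")
         .groupby("device_id")["received"]
         .resample("10s")
         .mean() * 100)
print(dr.head())
```

Plotting DR per device over time, rather than reporting a single average, is what exposes the transitory low-reliability periods described above.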
Abstract:
Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to prove theoretical results or to test hardware/products under development. Although this is an attractive solution, being a low-cost, easy and fast way to carry out some course work, it has major disadvantages. As everything is currently being done with/in a computer, students are losing the "feel" for the real values of physical magnitudes. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithmics, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, but real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface, allowing it to be used by different courses in which computers are central to the teaching/learning process, in order to give students a more realistic feel by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light, wind speed, which are connected to a central server that the students access over Ethernet, or are connected directly to the student's computer/laptop. These sensors use the communication ports available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language when a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using different types of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools. This allows sensors that are not available in a given school to be used by getting the values from other places that share them. Another remark is that students in the more advanced years, with (theoretically) more know-how, can use the courses that have some affinity with electronics development to build new sensor modules and expand the framework further. The final solution is very attractive: low cost, simple to develop, and flexible in its use of resources, since the same materials serve several courses, bringing real-world data into the students' computer work.
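As a hedged illustration of how a student program might pull a reading from the central sensor server, the sketch below assumes a simple line-based TCP protocol; the host name, port, and message format are inventions for the example, since the abstract does not specify the wire protocol:

```python
import socket

def read_sensor(host: str, name: str, port: int = 5000) -> float:
    """Request one reading from the (hypothetical) sensor server.
    Protocol assumed here: send 'GET <sensor>\n', receive a single
    numeric value as a text line."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(f"GET {name}\n".encode())
        reply = sock.makefile().readline()  # e.g. "23.4\n"
        return float(reply)

temperature = read_sensor("sensors.school.example", "temperature")
print(f"Room temperature: {temperature} °C")
```

The same few lines work from any course exercise, which is the point of routing all sensors through one server rather than course-specific hardware.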
Abstract:
Modeling Extract-Transform-Load (ETL) processes of a Data Warehousing System has always been a challenge. The heterogeneity of the sources, the quality of the data obtained, and the conciliation process are some of the issues that must be addressed in the design phase of this critical component. Commercial ETL tools often provide proprietary diagrammatic components and modeling languages that are not standard, thus failing to provide the ideal separation between a modeling platform and an execution platform. This separation, in conjunction with the use of standard notations and languages, is critical in a system that tends to evolve through time and which cannot afford to be undermined by a typically expensive tool that turns out to be an unsatisfactory component. In this paper we demonstrate the application of Relational Algebra as a modeling language for an ETL system, in an effort to standardize operations and provide a basis for uncommon ETL execution platforms.
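As an illustration of the kind of specification the paper advocates, a single transform/conciliation step can be written purely in standard relational algebra; the relation and attribute names below are invented for the example:

\[
\mathit{DW\_Sales} \;=\; \pi_{\mathit{cust\_sk},\,\mathit{date},\,\mathit{amount}}\Bigl(\sigma_{\mathit{amount}>0}(\mathit{SrcSales}) \;\bowtie_{\mathit{SrcSales.cust\_id}=\mathit{Lookup.cust\_id}}\; \mathit{Lookup}\Bigr)
\]

read as: filter out invalid source rows, conciliate the customer key against a lookup relation, and project the attributes loaded into the warehouse. Because every operator is standard, the same expression can be mapped onto any execution platform, which is the separation of modeling from execution argued for above.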
Abstract:
The incorporation of the Spanish university system into the European Higher Education Area has brought about a series of adaptations. Among the recommendations is the inclusion of an external training period in a company, which has resulted in significant changes in the degree syllabus in order to balance the theoretical and practical education required by the students. This new framework has been legally confirmed by the Spanish Government and, in the case of the University of Barcelona, by the publication of internal guidelines. Taking advantage of this new opportunity to adapt the Pharmacy degree to real-world problems in industry, the Dean's team of the Faculty of Pharmacy, with the support of the Faculty administrative staff and the Students Advisory Service, have assumed the challenge of including a new subject in the syllabus entitled Training in Companies. In parallel, a new activity has been set up to ensure that the students choose the most suitable company department/job for them and to help them pass the company interview. Under the name of Passport to a Profession, a series of ten explanatory talks has been scheduled every academic year. These talks deal with a broad range of topics aimed at providing the students with the basic tools they will need to make the most of a company training period and to make headway in the professional world when they finish their degree. In addition, three Faculty of Pharmacy-Pharmaceutical company workshops and two round-table conferences have been held in the last two years in order to bring the university and industry together. Notably, the project to provide students with company training is expanding on an international level, with two to three undergraduate students contracted every year by a United Kingdom-based multinational pharmaceutical company. The statistical data of the whole process have been analysed for a more in-depth understanding of the activity and to improve the programme.
Abstract:
BACKGROUND: This study examined potential predictors of remission among patients treated for major depressive disorder (MDD) in a naturalistic clinical setting, mostly in the Middle East, East Asia, and Mexico. METHODS: Data for this post hoc analysis were taken from a 6-month prospective, noninterventional, observational study that involved 1,549 MDD patients without sexual dysfunction at baseline in 12 countries worldwide. Depression severity was measured using the Clinical Global Impression of Severity and the 16-item Quick Inventory of Depressive Symptomatology Self-Report (QIDS-SR16). Depression-related pain was measured using the pain-related items of the Somatic Symptom Inventory. Remission was defined as a QIDS-SR16 score ≤5. Generalized estimating equation regression models were used to examine baseline factors associated with remission during follow-up. RESULTS: Being from East Asia (odds ratio [OR] 0.48 versus Mexico; P<0.001), a higher level of depression severity at baseline (OR 0.77, P=0.003, for Clinical Global Impression of Severity; OR 0.92, P<0.001, for QIDS-SR16), more previous MDD episodes (OR 0.92, P=0.007), previous treatments/therapies for depression (OR 0.78, P=0.030), and having any significant psychiatric or medical comorbidity at baseline (OR 0.60, P<0.001) were negatively associated with remission, whereas being male (OR 1.29, P=0.026) and treatment with duloxetine (OR 2.38 versus selective serotonin reuptake inhibitors, P<0.001) were positively associated with remission. However, the association between Somatic Symptom Inventory pain scores and remission no longer appeared significant in this multiple regression (P=0.580; P=0.008 in descriptive statistics), although it remained significant in a subgroup of patients treated with selective serotonin reuptake inhibitors (OR 0.97, P=0.023), but not in those treated with duloxetine (P=0.182). CONCLUSION: These findings are largely consistent with previous reports from the USA and Europe. They also highlight the potential mediating role of treatment with duloxetine on the negative relationship between depression-related pain and outcomes of depression.
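The generalized estimating equation models mentioned above can be sketched with standard software. The column names below are hypothetical stand-ins for the baseline factors listed in the abstract, with repeated observations clustered by patient:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical columns standing in for the abstract's covariates.
df = pd.read_csv("mdd_visits.csv")
X = sm.add_constant(df[["cgi_severity", "qids_sr16", "prior_episodes",
                        "male", "duloxetine", "comorbidity"]])
model = sm.GEE(df["remission"], X,
               groups=df["patient_id"],  # repeated measures per patient
               family=sm.families.Binomial(),
               cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())  # exponentiated coefficients give the ORs
```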
Abstract:
Classical computer vision methods can only weakly emulate some of the multi-level parallelisms in signal processing and information sharing that take place in different parts of the primates' visual system and enable it to accomplish many diverse functions of visual perception. One of the main functions of primate vision is to detect and recognise objects in natural scenes despite all the linear and non-linear variations of the objects and their environment. The superior performance of the primates' visual system compared to what machine vision systems have been able to achieve to date motivates scientists and researchers to further explore this area in pursuit of more efficient vision systems inspired by natural models. In this paper, building blocks for an efficient hierarchical object recognition model are proposed. Incorporating attention-based processing would lead to a system that processes the visual data in a non-linear way, focusing only on the regions of interest and hence reducing the processing time needed to achieve real-time performance. Further, it is suggested to modify the visual cortex model for recognizing objects by adding non-linearities in the ventral path, consistent with earlier discoveries reported by researchers in the neurophysiology of vision.
Abstract:
Building Risk-Neutral Densities (RND) from options data can provide market-implied expectations about the future behavior of a financial variable, and market expectations about financial variables may influence macroeconomic policy decisions. It can also be useful for decision making at corporate and financial institutions. This paper uses the Liu et al. (2007) approach to estimate the option-implied risk-neutral density of the Brazilian Real/US Dollar exchange rate. We then compare the RND with actual exchange rates, on a monthly basis, in order to estimate the relative risk aversion of investors and also obtain a real-world density for the exchange rate. We are the first to calculate relative risk aversion and the option-implied real-world density for an emerging market currency. Our empirical application uses a sample of Brazilian Real/US Dollar options traded at BM&F-Bovespa from 1999 to 2011. The RND is estimated using a Mixture of Two Log-Normals distribution, and the real-world density is then obtained by means of the Liu et al. (2007) parametric risk transformations. The relative risk aversion is calculated for the full sample. Our estimated value of the relative risk aversion parameter is around 2.7, which is in line with other articles that have estimated this parameter for the Brazilian economy, such as Araújo (2005) and Issler and Piqueira (2000). Our out-of-sample evaluation results showed that the RND has some ability to forecast the Brazilian Real exchange rate. Abe et al. (2007) also found mixed results in the out-of-sample analysis of the RND's forecasting ability for exchange rate options. However, when we incorporate risk aversion into the RND in order to obtain a real-world density, the out-of-sample performance improves substantially, with satisfactory results in both Kolmogorov and Berkowitz tests. Therefore, we would suggest not using the "pure" RND, but rather taking risk aversion into account in order to forecast the Brazilian Real exchange rate.
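For reference, under power utility with constant relative risk aversion \(\gamma\), the risk-neutral-to-real-world transformation takes the standard form below (a sketch of the textbook CRRA result, not a quotation from Liu et al.):

\[
f_{\mathbb{P}}(x) \;=\; \frac{x^{\gamma}\, f_{\mathbb{Q}}(x)}{\displaystyle\int_{0}^{\infty} u^{\gamma}\, f_{\mathbb{Q}}(u)\, \mathrm{d}u},
\]

where \(f_{\mathbb{Q}}\) is the estimated risk-neutral density, \(f_{\mathbb{P}}\) the implied real-world density, and \(\gamma \approx 2.7\) the relative risk aversion estimated in the paper. Setting \(\gamma = 0\) recovers the "pure" RND, which is why the risk-adjusted density can only differ from the RND when investors are risk averse.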
Abstract:
Outside of relatively limited crash testing with large trucks, very little is known regarding the performance of traffic barriers subjected to real-world large truck impacts. The purpose of this study was to investigate real-world large truck impacts into traffic barriers to determine barrier crash involvement rates, the impact performance of barriers not specifically designed to redirect large trucks, and the real-world performance of large-truck-specific barriers. Data sources included the Fatality Analysis Reporting System (2000-2009), the General Estimates System (2000-2009) and 155 in-depth large truck-to-barrier crashes from the Large Truck Crash Causation Study. Large truck impacts with a longitudinal barrier were found to comprise 3 percent of all police-reported longitudinal barrier impacts and roughly the same proportion of barrier fatalities. Based on a logistic regression model predicting barrier penetration, large truck barrier penetration risk was found to increase by a factor of 6 for impacts with barriers designed primarily for passenger vehicles. Although large-truck-specific barriers were found to perform better than barriers not designed for heavy vehicles, the penetration rate of these barriers was found to be 17 percent. This penetration rate is of particular concern because the higher test level barriers are designed to protect other road users, not the occupants of the large truck. Surprisingly, barriers not specifically designed for large truck impacts were found to prevent large truck penetration approximately half of the time. This suggests that adding costlier higher test level barriers may not always be warranted, especially on roadways with lower truck volumes.
Abstract:
Previous research conducted in the late 1980s suggested that vehicle impacts following an initial barrier collision increase severe occupant injury risk. Now over twenty-five years old, the data used in the previous research are no longer representative of the currently installed barriers or the US vehicle fleet. The purpose of this study is to provide a present-day assessment of secondary collisions and to determine whether full-scale barrier crash testing criteria provide an indication of secondary collision risk for real-world barrier crashes. The analysis included 1,383 (596,331 weighted) real-world barrier midsection impacts selected from thirteen years (1997-2009) of in-depth crash data available through the National Automotive Sampling System (NASS) / Crashworthiness Data System (CDS). For each suitable case, the scene diagram and available scene photographs were used to determine roadside and barrier-specific variables not available in NASS/CDS. Binary logistic regression models were developed for second event occurrence and resulting driver injury. Barrier lateral stiffness, post-impact vehicle trajectory, vehicle type, and pre-impact tracking conditions were found to be statistically significant contributors to secondary event occurrence. The presence of a second event was found to increase the likelihood of a serious driver injury by a factor of seven compared to cases with no second event present. Twenty-four full-scale crash test reports were obtained for common non-proprietary US barriers, and the risk of secondary collisions was determined using recommended evaluation criteria from NCHRP Report 350. It was found that the NCHRP Report 350 exit angle criterion alone was not sufficient to predict second collision occurrence for real-world barrier crashes.