939 results for data-driven simulation


Relevance: 90.00%

Abstract:

In this paper we develop a data-driven weight learning method for weighted quasi-arithmetic means where the observed data may vary in dimension.
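The class of means the abstract refers to has a compact closed form; the sketch below (an illustration, not the paper's weight-learning method) evaluates a weighted quasi-arithmetic mean g_inv(sum(w_i * g(x_i))), renormalizing a shared weight vector so inputs of varying dimension can be handled.

```python
import math

def quasi_arithmetic_mean(values, weights, g, g_inv):
    """Weighted quasi-arithmetic mean: g_inv(sum(w_i * g(x_i))).

    The weight vector is truncated and renormalized to the observed
    dimension, so inputs of varying length can share one weight vector
    (a simplifying assumption, not the paper's learning method).
    """
    w = weights[:len(values)]
    total = sum(w)
    w = [wi / total for wi in w]
    return g_inv(sum(wi * g(xi) for wi, xi in zip(w, values)))

# Geometric mean as a special case: g = log, g_inv = exp.
gm = quasi_arithmetic_mean([2.0, 8.0], [0.5, 0.5], math.log, math.exp)
```

With `g` the identity this reduces to the ordinary weighted arithmetic mean.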

Relevance: 90.00%

Abstract:

This thesis explores organisations' attitudes towards the business processes that sustain them: from the near-absence of structure, to the functional organisation, up to the advent of Business Process Reengineering and of Business Process Management, which emerged to overcome the limits and problems of the previous model. Within the BPM life cycle sits the methodology of process mining, which enables a level of process analysis starting from event data logs, i.e. the records of events relating to all activities supported by a corporate information system. Process mining can be seen as a natural bridge between process-based (but not data-driven) management disciplines and the new developments in business intelligence, capable of managing and manipulating the enormous volume of data available to companies (but which are not process-driven). The thesis describes the requirements and technologies that enable the discipline, as well as the three techniques it supports: process discovery, conformance checking and process enhancement. Process mining was used as the main tool in a consulting project by HSPI S.p.A. on behalf of a major Italian client, a provider of IT platforms and solutions. The project I took part in, described in the thesis, aims to support the organisation in its plan to improve internal performance, and made it possible to verify the applicability and limits of process mining techniques. Finally, the appendix contains a paper I wrote that collects the applications of the discipline in real business contexts, drawing data and information from working papers, business cases and direct channels. For its validity and completeness, this document has been published on the site of the IEEE Task Force on Process Mining.
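The process-discovery technique mentioned above typically begins by counting directly-follows relations in the event log; a minimal sketch with toy activity names (not the project's actual tooling):

```python
from collections import defaultdict

def directly_follows(event_log):
    """Count directly-follows relations (a, b) over the traces of an event log.

    Building this directly-follows graph is the first step of many
    process-discovery algorithms; each trace is a sequence of activities
    recorded by the information system.
    """
    dfg = defaultdict(int)
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)

# Hypothetical log: three cases of a simple approval process.
log = [
    ["register", "check", "approve"],
    ["register", "check", "reject"],
    ["register", "check", "approve"],
]
dfg = directly_follows(log)
```

The resulting edge counts (e.g. "check" directly followed by "approve" twice) are what discovery algorithms turn into a process model.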

Relevance: 90.00%

Abstract:

Model predictive control (MPC) has often been referred to in the literature as a potential method for more efficient control of building heating systems. Though a significant performance improvement can be achieved with an MPC strategy, the complexity introduced to the commissioning of the system is often prohibitive. Models are required which can capture the thermodynamic properties of the building with sufficient accuracy for meaningful predictions to be made. Furthermore, a large number of tuning weights may need to be determined to achieve a desired performance. For MPC to become a practicable alternative, these issues must be addressed. Acknowledging the impact of the external environment as well as the interaction of occupants on the thermal behaviour of the building, in this work, techniques have been developed for deriving building models from data in which large, unmeasured disturbances are present. A spatio-temporal filtering process was introduced to determine estimates of the disturbances from measured data, which were then incorporated with metaheuristic search techniques to derive high-order simulation models, capable of replicating the thermal dynamics of a building. While a high-order simulation model allowed for control strategies to be analysed and compared, low-order models were required for use within the MPC strategy itself. The disturbance estimation techniques were adapted for use with system-identification methods to derive such models. MPC formulations were then derived to enable a more straightforward commissioning process and implemented in a validated simulation platform. A prioritised-objective strategy was developed which allowed for the tuning parameters typically associated with an MPC cost function to be omitted from the formulation by separation of the conflicting requirements of comfort satisfaction and energy reduction within a lexicographic framework.
The formulation's improved ability to be set up and reconfigured under faulted conditions was also shown.
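The prioritised-objective idea described above can be illustrated with a toy lexicographic selection, where energy use only breaks ties among inputs that already minimise comfort violation (all numbers below are assumptions for illustration, not the thesis's formulation):

```python
def lexicographic_choice(candidates, objectives):
    """Pick the candidate minimising the objectives in strict priority order.

    Python tuple comparison gives the lexicographic ordering directly:
    a later objective only breaks ties in all earlier ones, so no
    trade-off weights between the objectives are needed.
    """
    return min(candidates, key=lambda u: tuple(f(u) for f in objectives))

# Toy heating decision: candidate input powers (kW) for one step.
candidates = [0.0, 1.0, 2.0, 3.0]
setpoint, temp_gain, current_temp = 21.0, 1.5, 18.5

def comfort(u):
    """Comfort violation (K below setpoint) for input power u."""
    return max(0.0, setpoint - (current_temp + temp_gain * u))

def energy(u):
    """Energy-use proxy: simply the input power."""
    return u

u_star = lexicographic_choice(candidates, [comfort, energy])
```

Here both 2 kW and 3 kW achieve zero comfort violation, and the secondary energy objective selects the cheaper of the two without any tuning weight between the goals.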

Relevance: 90.00%

Abstract:

This paper proposes a process for the classification of new residential electricity customers. The current state of the art is extended by using a combination of smart metering and survey data and by using model-based feature selection for the classification task. Firstly, the normalized representative consumption profiles of the population are derived through the clustering of data from households. Secondly, new customers are classified using survey data and a limited amount of smart metering data. Thirdly, regression analysis and model-based feature selection results explain the importance of the variables and which are the drivers of different consumption profiles, enabling the extraction of appropriate models. The results of a case study show that the use of survey data significantly increases the accuracy of the classification task (by up to 20%). Considering four consumption groups, more than half of the customers are correctly classified with only one week of metering data; with more weeks, accuracy improves significantly. The use of model-based feature selection resulted in a significantly lower number of features, allowing an easy interpretation of the derived models.
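The first two steps of the proposed process can be sketched as shape normalisation plus nearest-centroid assignment; the representative profiles below are hypothetical, not the case study's clusters:

```python
def normalize(profile):
    """Scale a load profile to unit total so shape, not volume, is compared."""
    s = sum(profile)
    return [p / s for p in profile]

def nearest_centroid(profile, centroids):
    """Assign a normalized profile to the closest representative profile."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda k: dist(profile, centroids[k]))

# Hypothetical representative profiles (four time-of-day bins, unit sum),
# standing in for the centroids derived by clustering the population.
centroids = {
    "evening_peak": [0.10, 0.15, 0.25, 0.50],
    "flat":         [0.25, 0.25, 0.25, 0.25],
}

# A new customer with one week of metering data, aggregated into the bins.
new_customer = normalize([2.0, 3.0, 5.0, 10.0])
group = nearest_centroid(new_customer, centroids)
```

In the paper the assignment additionally draws on survey data and model-based feature selection; this sketch shows only the consumption-shape component.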

Relevance: 80.00%

Abstract:

A number of intervention approaches have been developed to improve work-related driving safety. However, past interventions have been limited in that they have been data-driven and have not been developed within a theoretical framework. The aim of this study is to present a theory-driven intervention. Based on the methodology developed by Ludwig and Geller (1991), this study evaluates the effectiveness of a participative education intervention on a group of work-related drivers (n = 28; experimental group n = 19, control n = 9). The results support the effectiveness of the intervention in reducing speeding over a six-month period, while a non-significant increase was found in the control group. The results of this study have important implications for organisations developing theory-driven interventions designed to improve work-related driving behaviour.

Relevance: 80.00%

Abstract:

This study explores through a lifestream narrative how the life experiences of a female primary school principal are organised as practical knowledge, and are used to inform action that is directed towards creating a sustainable school culture. An alternative model of school leadership is presented which describes the thinking and activity of a leader as a process. The process demonstrates how a leader's practical knowledge is dynamic, broadly based in experiential life, and open to change. As such, it is described as a model of sustainable leadership-in-process. The research questions at the heart of this study are: How does a leader construct and organise knowledge in the enactment of the principalship to deal with the dilemmas and opportunities that arise every day in school life? And: What does this particular way of organising knowledge look like in the effort to build a sustainable school community? The sustainable leadership-in-process thesis encapsulates new ways of leading primary schools through the principalship. These new ways are described as developing and maintaining the following dimensions of leadership: quality relationships, a collective (shared) vision, collaboration and partnerships, and high-achieving learning environments. Such dimensions are enacted by the principal through the activities of conversations, performance development, research and data-driven action, promoting innovation, and anticipating and predicting the future. Sustainable leadership-in-process is shared, dynamic, visible and transparent, and is conducted through the processes of positioning, defining, organising, experimenting and evaluating in a continuous and iterative way. A rich understanding of the specificity of the life of a female primary school principal was achieved using storytelling, story listening and story creation in a collaborative relationship between the researcher and the researched participant, as a means of educational theorising.
Analysis and interpretation were undertaken as a recursive process in which the immediate interpretations were shared with the researched participant. The view of theorising adopted in this research is that of theory as hermeneutic; that is, theory is generated out of the stories of experiential life, rather than discovered in the stories.

Relevance: 80.00%

Abstract:

This paper presents a critical review of past research in the work-related driving field in light vehicle fleets (e.g., vehicles < 4.5 tonnes) and an intervention framework that provides future direction for practitioners and researchers. Although work-related driving crashes have become the most common cause of death, injury, and absence from work in Australia and overseas, very limited research has progressed in establishing effective strategies to improve safety outcomes. In particular, the majority of past research has been data-driven, and therefore, limited attention has been given to theoretical development in establishing the behavioural mechanism underlying driving behaviour. As such, this paper argues that to move forward in the field of work-related driving safety, practitioners and researchers need to gain a better understanding of the individual and organisational factors influencing safety through adopting relevant theoretical frameworks, which in turn will inform the development of specifically targeted theory-driven interventions. This paper presents an intervention framework that is based on relevant theoretical frameworks and sound methodological design, incorporating interventions that can be directed at the appropriate level, individual and driving target group.

Relevance: 80.00%

Abstract:

There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states—perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a Bernoulli trial with unequal probability of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data.
A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed in practice. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed—and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales and not an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
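The simulation experiment's core point, that low exposure plus unobserved heterogeneity alone produce "excess" zeros relative to a single Poisson benchmark, can be reproduced in a few lines (toy parameters chosen for illustration, not the study's design):

```python
import math
import random

random.seed(42)

# Hypothetical road segments: crash risk per trial varies across sites
# (unobserved heterogeneity) and exposure is low (few trials), mimicking
# a short analysis period. Each site follows Poisson trials: independent
# Bernoulli events with unequal probabilities across sites.
n_sites, n_trials = 2000, 50
site_p = [random.expovariate(100.0) for _ in range(n_sites)]  # mean p = 0.01

counts = [sum(random.random() < p for _ in range(n_trials)) for p in site_p]

mean = sum(counts) / n_sites
observed_zero_frac = counts.count(0) / n_sites
poisson_zero_prob = math.exp(-mean)  # zeros expected under one Poisson fit
# Jensen's inequality: E[exp(-lambda_i)] >= exp(-E[lambda_i]), so the
# heterogeneous low-exposure data show more zeros than the Poisson predicts,
# with no dual-state (zero-inflated) mechanism involved.
```

Raising the exposure (`n_trials`) or explaining the between-site variation with covariates shrinks the gap, which is exactly the paper's recommended alternative to zero-inflated models.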

Relevance: 80.00%

Abstract:

Tracking/remote monitoring systems using GNSS are a proven method to enhance the safety and security of personnel and vehicles carrying precious or hazardous cargo. While GNSS tracking appears to mitigate some of these threats, if not adequately secured, it can be a double-edged sword allowing adversaries to obtain sensitive shipment and vehicle position data to better coordinate their attacks, and to provide a false sense of security to monitoring centers. Tracking systems must be designed with the ability to perform route-compliance checking and thwart attacks ranging from low-level attacks such as the cutting of antenna cables to medium- and high-level attacks involving radio jamming and signal/data-level simulation, especially where the goods transported have a potentially high value to terrorists. This paper discusses the use of GNSS in critical tracking applications, addressing the mitigation of GNSS security issues, augmentation systems and communication systems in order to provide highly robust and survivable tracking systems.
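A route-compliance check of the kind mentioned above can be sketched as a point-to-polyline distance test (planar toy coordinates; a real system would work on geodetic coordinates and would also validate signal integrity to resist jamming and simulation attacks):

```python
import math

def point_segment_dist(p, a, b):
    """Distance from point p to segment ab (planar approximation)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        t = 0.0
    else:
        # Project p onto the segment, clamped to its endpoints.
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def route_compliant(position, route, tolerance):
    """Flag a GNSS fix as compliant if it lies within `tolerance`
    of the planned route polyline."""
    return min(point_segment_dist(position, a, b)
               for a, b in zip(route, route[1:])) <= tolerance

# Hypothetical planned route and two reported fixes.
route = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0)]
on_route = route_compliant((5.0, 0.3), route, tolerance=0.5)
off_route = not route_compliant((5.0, 3.0), route, tolerance=0.5)
```

The off-route case is what would raise an alarm at the monitoring center; the harder problem the paper addresses is ensuring the position data feeding this check can itself be trusted.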

Relevance: 80.00%

Abstract:

One of the research focuses in the integer least squares problem is the decorrelation technique, which reduces the number of integer parameter search candidates and improves the efficiency of the integer parameter search method. It remains a challenging issue in carrier-phase ambiguity resolution and plays a critical role in the future of high-precision GNSS positioning. Currently, three main decorrelation techniques are employed: integer Gaussian decorrelation, the Lenstra–Lenstra–Lovász (LLL) algorithm and the inverse integer Cholesky decorrelation (IICD) method. Although the performance of these three state-of-the-art methods has been demonstrated, there is still potential for further improvement. To measure the performance of decorrelation techniques, the condition number is usually used as the criterion. Additionally, the number of grid points in the search space can be used directly as a performance measure, as it denotes the size of the search space. However, a smaller initial volume of the search ellipsoid does not always mean a smaller number of candidates. This research proposes a modified inverse integer Cholesky decorrelation (MIICD) method which improves decorrelation performance over the other three techniques. The decorrelation performance of these methods was evaluated based on the condition number of the decorrelation matrix, the number of search candidates and the initial volume of the search space. Additionally, the success rate of the decorrelated ambiguities was calculated for all methods to investigate ambiguity-validation performance. The performance of the different decorrelation methods was tested and compared using both simulated and real data. The simulation scenarios employ an isotropic probabilistic model using a predetermined eigenvalue and without any geometry or weighting-system constraints.
The MIICD method outperformed the other three methods, improving conditioning over the LAMBDA method by 78.33% and 81.67% without and with the eigenvalue constraint, respectively. The real-data scenarios involve both a single-constellation case and a dual-constellation case. Experimental results demonstrate that, compared with LAMBDA, the MIICD method reduces the condition number by 78.65% and 97.78% in the single-constellation and dual-constellation cases, respectively. It also reduces the number of search candidate points by 98.92% and 100% in the single-constellation and dual-constellation cases.
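The role of the condition number as a decorrelation criterion can be illustrated with a single integer Gaussian decorrelation step on a toy 2x2 ambiguity covariance matrix (this shows the baseline integer Gaussian technique, not MIICD itself):

```python
import math

def eig2(Q):
    """Eigenvalues (min, max) of a symmetric 2x2 matrix."""
    a, b, d = Q[0][0], Q[0][1], Q[1][1]
    t = (a + d) / 2
    r = math.sqrt(((a - d) / 2) ** 2 + b * b)
    return t - r, t + r

def cond(Q):
    """Condition number: ratio of largest to smallest eigenvalue."""
    lo, hi = eig2(Q)
    return hi / lo

def gauss_step(Q):
    """One integer Gaussian decorrelation step Z^T Q Z with the unimodular
    transform Z = [[1, -mu], [0, 1]], mu = round(q12 / q22)."""
    mu = round(Q[0][1] / Q[1][1])
    a, b, d = Q[0][0], Q[0][1], Q[1][1]
    return [[a, b - mu * a], [b - mu * a, d - 2 * mu * b + mu * mu * a]]

# Toy, highly correlated ambiguity covariance matrix (an assumption,
# not data from the paper).
Q = [[4.0, 3.9], [3.9, 4.0]]
Qd = gauss_step(Q)
before, after = cond(Q), cond(Qd)
```

Because Z is unimodular with integer entries, the integer ambiguity candidates map one-to-one, while the condition number of the transformed matrix drops sharply, which is what makes the subsequent integer search cheaper.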

Relevance: 80.00%

Abstract:

The monitoring sites comprising a state of the environment (SOE) network must be carefully selected to ensure that they will be representative of the broader resource. Hierarchical cluster analysis (HCA) is a data-driven technique that can potentially be employed to assess the representativeness of an SOE monitoring network. The objective of this paper is to explore the use of HCA as an approach for assessing the representativeness of the New Zealand National Groundwater Monitoring Programme (NGMP), which comprises 110 monitoring sites across the country.
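HCA groups sites by the similarity of their scaled attribute vectors; a naive single-linkage sketch with hypothetical site data (an NGMP assessment would use full hydrochemistry vectors and a proper dendrogram):

```python
def agglomerate(points, k):
    """Naive single-linkage agglomerative clustering into k clusters.

    A minimal stand-in for HCA: repeatedly merge the two clusters whose
    closest members are nearest, until k clusters remain.
    """
    clusters = [[i] for i in range(len(points))]

    def dist(i, j):
        return sum((a - b) ** 2 for a, b in zip(points[i], points[j]))

    while len(clusters) > k:
        x, y = min(
            ((x, y) for x in range(len(clusters)) for y in range(x + 1, len(clusters))),
            key=lambda xy: min(dist(i, j) for i in clusters[xy[0]] for j in clusters[xy[1]]),
        )
        clusters[x] += clusters[y]
        del clusters[y]
    return clusters

# Hypothetical sites described by two scaled attributes,
# e.g. (conductivity, nitrate) after normalisation.
sites = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.8), (0.95, 0.85), (0.5, 0.1)]
groups = agglomerate(sites, 2)
```

A network would be judged representative if its monitoring sites spread across the clusters found in the broader resource, rather than concentrating in one group.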

Relevance: 80.00%

Abstract:

This paper outlines a feasible scheme to extract the deck trend when a rotary-wing unmanned aerial vehicle (RUAV) approaches an oscillating deck. An extended Kalman filter (EKF) is developed to fuse measurements from multiple sensors for effective estimation of the unknown deck heave motion. Also, a recursive Prony analysis (PA) procedure is proposed to implement online curve-fitting of the estimated heave motion. The proposed PA constructs an appropriate model with parameters identified using the forgetting-factor recursive least squares (FFRLS) method. The deck trend is then extracted by separating dominant modes. Performance of the proposed procedure is evaluated using real ship motion data, and simulation results confirm the suitability of the proposed method for safe landing of RUAVs operating in a maritime environment.
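The FFRLS step can be sketched for the simplest possible trend model, y ≈ theta0 + theta1 * t, where the forgetting factor discounts older samples so the fit can track a drifting trend (a minimal sketch, not the paper's full Prony model):

```python
def ffrls(samples, lam=0.95):
    """Forgetting-factor recursive least squares for y ~ theta0 + theta1 * t.

    lam < 1 exponentially discounts older samples; the standard RLS
    recursion updates the estimate theta and covariance P per sample.
    """
    theta = [0.0, 0.0]
    P = [[1e6, 0.0], [0.0, 1e6]]  # large initial covariance (weak prior)
    for t, y in samples:
        phi = [1.0, t]  # regressor for this sample
        Pphi = [P[0][0] * phi[0] + P[0][1] * phi[1],
                P[1][0] * phi[0] + P[1][1] * phi[1]]
        denom = lam + phi[0] * Pphi[0] + phi[1] * Pphi[1]
        K = [Pphi[0] / denom, Pphi[1] / denom]  # gain
        err = y - (theta[0] * phi[0] + theta[1] * phi[1])  # prediction error
        theta = [theta[0] + K[0] * err, theta[1] + K[1] * err]
        P = [[(P[r][c] - K[r] * Pphi[c]) / lam for c in range(2)] for r in range(2)]
    return theta

# Noise-free synthetic "deck trend": y = 0.5 + 0.2 * t, sampled at 10 Hz.
data = [(t * 0.1, 0.5 + 0.2 * (t * 0.1)) for t in range(50)]
theta = ffrls(data)
```

In the paper the same recursion identifies the parameters of a Prony model (sums of damped sinusoids) rather than a straight line, but the update equations are identical in structure.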

Relevance: 80.00%

Abstract:

This presentation will deal with the transformations that have occurred in news journalism worldwide in the early 21st century. I will argue that they have been the most significant changes to the profession for 100 years, and that the challenges facing the news media industry in responding to them are substantial, as are those facing journalism education. I will develop this argument in relation to the crisis of the newspaper business model, and why social media, blogging and citizen journalism have not filled the gap left by the withdrawal of resources from traditional journalism. I will also draw upon Wikileaks as a case study in debates about computational and data-driven journalism, and whether large-scale "leaks" of electronic documents may be the future of investigative journalism.

Relevance: 80.00%

Abstract:

Cancer poses an undeniable burden to the health and wellbeing of the Australian community. According to a recent report commissioned by the Australian Institute of Health and Welfare (AIHW, 2010), one in every two Australians on average will be diagnosed with cancer by the age of 85, making cancer the second leading cause of death in 2007, preceded only by cardiovascular disease. Despite modest decreases in standardised combined cancer mortality over the past few decades, in part due to increased funding and access to screening programs, cancer remains a significant economic burden. In 2010, all cancers accounted for an estimated 19% of the country's total burden of disease, equating to approximately $3.8 billion in direct health system costs (Cancer Council Australia, 2011). Furthermore, there remain established socio-economic and other demographic inequalities in cancer incidence and survival, for example, by indigenous status and rurality. Therefore, in the interests of the nation's health and economic management, there is an immediate need to devise data-driven strategies to not only understand the socio-economic drivers of cancer but also facilitate the implementation of cost-effective resource allocation for cancer management...

Relevance: 80.00%

Abstract:

BACKGROUND: Effective management of chronic diseases such as prostate cancer is important. Research suggests a tendency to use self-care treatment options such as over-the-counter (OTC) complementary medications among prostate cancer patients. The current trend in patient-driven recording of health data in an online Personal Health Record (PHR) presents an opportunity to develop new data-driven approaches for improving prostate cancer patient care. However, the ability of current online solutions to share patients' data for better decision support is limited. An informatics approach may improve online sharing of self-care interventions among these patients. It can also provide better evidence to support decisions made during their self-managed care. AIMS: To identify requirements for an online system and describe a new case-based reasoning (CBR) method for improving self-care of advanced prostate cancer patients in an online PHR environment. METHOD: A non-identifying online survey was conducted to understand self-care patterns among prostate cancer patients and to identify requirements for an online information system. The pilot study was carried out between August 2010 and December 2010. A case base of 52 patients was developed. RESULTS: The data analysis showed self-care patterns among the prostate cancer patients. Selenium (55%) was the most common complementary supplement used by the patients, and paracetamol (about 45%) the most commonly used OTC medication. CONCLUSION: The results of this study specified the requirements for an online case-based reasoning information system. The outcomes of this study are being incorporated in the design of the proposed artificial intelligence (AI) driven patient journey browser system. A basic version of the proposed system is currently being considered for implementation.
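The "retrieve" step of a CBR system like the one proposed can be sketched as similarity-ranked lookup over the case base (the attribute names and values below are hypothetical, not the study's schema):

```python
def similarity(a, b):
    """Fraction of matching attributes between two cases (toy metric)."""
    keys = a.keys() & b.keys()
    return sum(a[k] == b[k] for k in keys) / len(keys)

def retrieve(case_base, query, k=1):
    """Retrieve the k most similar past cases: the first step of the
    classic CBR cycle (retrieve, reuse, revise, retain)."""
    return sorted(case_base, key=lambda c: similarity(c, query), reverse=True)[:k]

# Hypothetical, de-identified case attributes.
cases = [
    {"stage": "advanced", "supplement": "selenium", "otc": "paracetamol"},
    {"stage": "early", "supplement": "none", "otc": "ibuprofen"},
    {"stage": "advanced", "supplement": "selenium", "otc": "ibuprofen"},
]
query = {"stage": "advanced", "supplement": "selenium", "otc": "paracetamol"}
best = retrieve(cases, query)[0]
```

In the proposed system, the retrieved cases' self-care interventions and outcomes would be reused as evidence to support a new patient's decisions.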