816 results for Data-Driven Behavior Modeling


Relevance: 100.00%

Abstract:

Collecting and analyzing consumer data is essential in today’s data-driven business environment. However, consumers are becoming more aware of the value of the information they can provide to companies and are therefore more reluctant to share it for free. Companies thus need to find ways to motivate consumers to disclose personal information. The main research question of the study was formulated as “How can companies motivate consumers to disclose personal information?” and was further divided into two subquestions: 1) What types of benefits motivate consumers to disclose personal information? 2) How does the disclosure context affect consumers’ information disclosure behavior? The conceptual framework consisted of a classification of extrinsic and intrinsic benefits, and of moderating factors, which were identified on the basis of prior research in the field. The study was conducted using qualitative research methods. The primary data was collected by interviewing ten representatives from eight companies. The data was analyzed and reported according to predetermined themes. The findings of the study confirm that consumers can be motivated to disclose personal information by offering different types of extrinsic (monetary saving, time saving, self-enhancement, and social adjustment) and intrinsic (novelty, pleasure, and altruism) benefits. However, not all the benefits are equally useful in convincing the customer to disclose information. Moreover, different factors in the disclosure context can either weaken or strengthen the effect of the benefits on the consumers’ motivation to disclose personal information. Such factors include the consumer’s privacy concerns, perceived trust towards the company, the relevance of the requested information, personalization, website elements (especially the security, usability, and aesthetics of a website), and the consumer’s shopping motivation. This study has several contributions.
It is essential that companies recognize the benefits most attractive for their business and their customers, and that they understand how the disclosure context affects the consumer’s information disclosure behavior. The likelihood of information disclosure can be increased, for example, by offering benefits that meet consumers’ needs and preferences, improving the relevance of the requested information, stating the reasons for data collection, creating and maintaining a trustworthy image of the company, and enhancing the quality of the company’s website.

Relevance: 100.00%

Abstract:

The problem of impostor dataset selection for GMM-based speaker verification is addressed through the recently proposed data-driven background dataset refinement technique. The SVM-based refinement technique selects from a candidate impostor dataset those examples that are most frequently selected as support vectors when training a set of SVMs on a development corpus. This study demonstrates the versatility of dataset refinement in the task of selecting suitable impostor datasets for use in GMM-based speaker verification. The use of refined Z- and T-norm datasets provided performance gains of 15% in EER in the NIST 2006 SRE over the use of heuristically selected datasets. The refined datasets were shown to generalise well to the unseen data of the NIST 2008 SRE.
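
The refinement criterion described above (keep the candidate impostor examples most often selected as support vectors across a set of SVMs trained on a development corpus) can be sketched roughly as follows. The feature dimensions, pool sizes, and the linear-kernel SVM are illustrative assumptions, not the exact setup of the study.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-ins: rows are fixed-length utterance representations
# (e.g. GMM supervectors in the original work); sizes are illustrative.
candidates = rng.normal(0.0, 1.0, size=(200, 10))   # candidate impostor pool
dev_targets = rng.normal(0.5, 1.0, size=(20, 10))   # development-corpus speakers

# Train one SVM per development speaker against the whole candidate pool and
# count how often each candidate example ends up as a support vector.
sv_counts = np.zeros(len(candidates), dtype=int)
for target in dev_targets:
    X = np.vstack([target[None, :], candidates])
    y = np.array([1] + [0] * len(candidates))
    clf = SVC(kernel="linear", C=1.0).fit(X, y)
    impostor_svs = clf.support_[clf.support_ > 0] - 1   # index 0 is the target
    sv_counts[impostor_svs] += 1

# Refined impostor dataset: the candidates most frequently selected as SVs.
refined = candidates[np.argsort(sv_counts)[::-1][:50]]
```

In practice the refined subset would then replace a heuristically selected Z- or T-norm cohort.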

Relevance: 100.00%

Abstract:

The ability to forecast machinery failure is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models for forecasting machinery health based on condition data. Although these models have aided the advancement of the discipline, they have made only a limited contribution to developing an effective machinery health prognostic system. The literature review indicates that there is not yet a prognostic model that directly models and fully utilises suspended condition histories (which are very common in practice since organisations rarely allow their assets to run to failure); that effectively integrates population characteristics into prognostics for longer-range prediction in a probabilistic sense; that deduces the non-linear relationship between measured condition data and actual asset health; and that involves minimal assumptions and requirements. This work presents a novel approach to addressing the above-mentioned challenges. The proposed model consists of a feed-forward neural network, the training targets of which are asset survival probabilities estimated using a variation of the Kaplan-Meier estimator and a degradation-based failure probability density estimator. The adapted Kaplan-Meier estimator is able to model the actual survival status of individual failed units and estimate the survival probability of individual suspended units. The degradation-based failure probability density estimator, on the other hand, extracts population characteristics and computes conditional reliability from available condition histories instead of from reliability data. The estimated survival probability and the relevant condition histories are respectively presented as “training target” and “training input” to the neural network. The trained network is capable of estimating the future survival curve of a unit when a series of condition indices is input.
Although the concept proposed may be applied to the prognosis of various machine components, rolling element bearings were chosen as the research object because rolling element bearing failure is one of the foremost causes of machinery breakdowns. Computer-simulated and industry case study data were used to compare the prognostic performance of the proposed model and four control models, namely: two feed-forward neural networks with the same training function and structure as the proposed model but neglecting suspended histories; a time series prediction recurrent neural network; and a traditional Weibull distribution model. The results support the assertion that the proposed model performs better than the other four models and that it produces adaptive prediction outputs with a useful representation of survival probabilities. This work presents a compelling concept for non-parametric data-driven prognosis, and for utilising available asset condition information more fully and accurately. It demonstrates that machinery health can indeed be forecasted. The proposed prognostic technique, together with ongoing advances in sensors and data-fusion techniques, and increasingly comprehensive databases of asset condition data, holds promise for increased asset availability, maintenance cost effectiveness, operational safety and, ultimately, organisational competitiveness.
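
As a rough illustration of the survival targets described above, here is a minimal Kaplan-Meier product-limit estimator that handles suspended (censored) units. The bearing lifetimes are made-up numbers, and the paper's adapted estimator and degradation-based density estimator are not reproduced.

```python
import numpy as np

def kaplan_meier(times, failed):
    """Product-limit survival estimate with censoring.

    times  : event or suspension time of each unit
    failed : 1 if the unit failed at that time, 0 if suspended (censored)
    """
    order = np.argsort(times, kind="stable")
    times = np.asarray(times, dtype=float)[order]
    failed = np.asarray(failed)[order]
    at_risk = len(times)
    surv, s = [], 1.0
    for d in failed:
        if d:                    # a failure reduces the survival estimate
            s *= (at_risk - 1) / at_risk
        surv.append(s)           # a suspension only shrinks the risk set
        at_risk -= 1
    return times, np.array(surv)

# Made-up bearing histories: three failures, two suspensions (still running
# when observation stopped).
t, s = kaplan_meier([150, 320, 320, 480, 610], [1, 1, 0, 1, 0])
print(s)  # survival estimate after each sorted event time
```

Estimates like these would serve as the "training target" values for the network, one per condition history.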

Relevance: 100.00%

Abstract:

Background: Physical activity is a key modifiable behavior impacting a number of important health outcomes. The path to developing chronic diseases commonly commences with lifestyle patterns developed during childhood and adolescence. This study examined whether parent physical activity and other factors correlated with physical activity amongst children are associated with self-reported physical activity in adolescents. Methods: A total of 115 adolescents (aged 12-14) and their parents completed questionnaire assessments. Self-reported physical activity was measured amongst adolescents and their parents using the International Physical Activity Questionnaire for Adolescents (IPAQ-A) and the International Physical Activity Questionnaire (IPAQ), respectively. Adolescents also completed the Children’s Physical Activity Correlates (CPAC), which measured factors that have previously demonstrated an association with physical activity amongst children. To examine whether parent physical activity or items from the CPAC were associated with self-reported adolescent physical activity, backward step-wise regression was undertaken. The least significant item was removed at each step until all remaining items reached two-tailed alpha=0.05. Results: A total of 93 (80.9%) adolescents and their parents had complete data sets and were included in the analysis. Independent variables were removed in the order: perceptions of parental role modeling; importance of exercise; perceptions of parental encouragement; peer acceptance; fun of physical exertion; perceived competence; parent physical activity; self-esteem; liking of exercise; and parental influence. The only variable remaining in the model was ‘liking of games and sport’ (p=0.003, adjusted r-squared=0.085). Discussion: These findings indicate that factors associated with self-reported physical activity in adolescents are not necessarily the same as those for younger children (aged 8-11).
While ‘liking of games and sport’ was included in the final model, the r-squared value did not indicate a strong association. Interestingly, parent self-reported physical activity was not included in the final model. It is likely that adolescent physical activity may be influenced by a variety of direct and indirect forms of socialization. These findings do support the view that intrinsically motivated themes such as the liking of games and sport take precedence over outside influences, like those presented by parents, in determining youth physical activity behaviors. These findings do not suggest that parents have no influence on adolescent physical activity patterns, but rather, the influence is likely to be more complex than physical activity behavior modeling perceived by the adolescent. Further research in this field is warranted in order to better understand potential contributors to successful physical activity promotion interventions amongst young adolescents.
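
The backward elimination procedure used in the Methods can be sketched as below: drop the least significant predictor until every remaining two-tailed p-value is under 0.05. The data here are simulated, and the variable names are placeholders rather than the CPAC items.

```python
import numpy as np
from scipy import stats

def backward_stepwise(X, y, names, alpha=0.05):
    """Backward elimination for OLS: repeatedly drop the least significant
    predictor (largest two-tailed p-value) until all remaining p < alpha."""
    names = list(names)
    while names:
        Xd = np.column_stack([np.ones(len(y))] + [X[n] for n in names])
        beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        resid = y - Xd @ beta
        dof = len(y) - Xd.shape[1]
        se = np.sqrt(np.diag((resid @ resid / dof) * np.linalg.inv(Xd.T @ Xd)))
        pvals = 2 * stats.t.sf(np.abs(beta[1:] / se[1:]), dof)  # skip intercept
        worst = int(np.argmax(pvals))
        if pvals[worst] < alpha:
            return names, dict(zip(names, pvals))
        names.pop(worst)
    return [], {}

rng = np.random.default_rng(1)
n = 93                                   # complete cases, as in the study
X = {f"x{i}": rng.normal(size=n) for i in range(5)}
y = 0.6 * X["x0"] + rng.normal(size=n)   # only x0 truly predicts y
kept, pvals = backward_stepwise(X, y, X.keys())
print(kept)
```

With a single genuine predictor, the procedure typically retains only that variable, mirroring the study's single-item final model.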

Relevance: 100.00%

Abstract:

The skyrocketing trend for social media on the Internet greatly alters analytical Customer Relationship Management (CRM). Against this backdrop, the purpose of this paper is to advance the conceptual design of Business Intelligence (BI) systems with data identified from social networks. We develop an integrated social network data model, based on an in-depth analysis of Facebook. The data model can inform the design of data warehouses in order to offer new opportunities for CRM analyses, leading to a more consistent and richer picture of customers’ characteristics, needs, wants, and demands. Four major contributions are offered. First, Social CRM and Social BI are introduced as emerging fields of research. Second, we develop a conceptual data model to identify and systematize the data available on online social networks. Third, based on the identified data, we design a multidimensional data model as an early contribution to the conceptual design of Social BI systems and demonstrate its application by developing management reports in a retail scenario. Fourth, intellectual challenges for advancing Social CRM and Social BI are discussed.
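
To make the multidimensional idea concrete, here is a toy star schema and roll-up report in SQLite; the table and column names (fact_interaction, dim_customer, dim_date) and the figures are invented for illustration and are not the paper's actual model.

```python
import sqlite3

# Toy star schema: a fact table of social interactions with customer and
# date dimensions, rolled up into a segment-by-month management report.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, segment TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_interaction (
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    likes INTEGER,
    comments INTEGER);
INSERT INTO dim_customer VALUES (1, 'loyal'), (2, 'new');
INSERT INTO dim_date VALUES (1, '2024-01'), (2, '2024-02');
INSERT INTO fact_interaction VALUES (1, 1, 5, 2), (1, 2, 3, 1), (2, 2, 7, 4);
""")
report = con.execute("""
    SELECT c.segment, d.month, SUM(f.likes) AS likes
    FROM fact_interaction f
    JOIN dim_customer c USING (customer_id)
    JOIN dim_date d USING (date_id)
    GROUP BY c.segment, d.month
    ORDER BY c.segment, d.month
""").fetchall()
print(report)
```

A Social BI warehouse would extend this pattern with dimensions for network, content type, and so on.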

Relevance: 100.00%

Abstract:

From human biomonitoring data that are increasingly collected in the United States, Australia, and other countries from large-scale field studies, we obtain snapshots of concentration levels of various persistent organic pollutants (POPs) within a cross section of the population at different times. Not only can we observe the trends within this population over time, but we can also gain information going beyond the obvious time trends. By combining the biomonitoring data with pharmacokinetic modeling, we can reconstruct the time-variant exposure to individual POPs, determine their intrinsic elimination half-lives in the human body, and predict future levels of POPs in the population. Different approaches have been employed to extract information from human biomonitoring data. Pharmacokinetic (PK) models were combined with longitudinal data [1], with single [2] or multiple [3] average concentrations of cross-sectional data (CSD), or with multiple CSD with or without empirical exposure data [4]. In the latter study, for the first time, the authors based their modeling outputs on two sets of CSD and empirical exposure data, so that their model outputs were further constrained by an extensive body of empirical measurements. Here we use a PK model to analyze recent levels of PBDE concentrations measured in the Australian population. In this study, we are able to base our model results on four sets [5-7] of CSD; we focus on two PBDE congeners that have been shown [3,5,8-9] to differ in intake rates and half-lives, with BDE-47 being associated with high intake rates and a short half-life and BDE-153 with lower intake rates and a longer half-life. By fitting the model to PBDE levels measured in different age groups in different years, we determine the level of intake of BDE-47 and BDE-153, as well as the half-lives of these two chemicals in the Australian population.
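
The reconstruction rests on a simple pharmacokinetic relationship between intake, elimination, and body burden. The sketch below uses the closed-form solution of a one-compartment model with invented intake rates and half-lives that reflect only the qualitative BDE-47 vs BDE-153 contrast, not fitted values.

```python
import numpy as np

# One-compartment sketch (hypothetical rates): body burden B(t) under constant
# intake I and first-order elimination at rate k follows
#   dB/dt = I - k * B,  with half-life t_half = ln(2) / k.
def body_burden(t, intake, half_life, b0=0.0):
    k = np.log(2.0) / half_life
    return intake / k + (b0 - intake / k) * np.exp(-k * t)

t = np.linspace(0.0, 30.0, 301)                        # years
bde47 = body_burden(t, intake=10.0, half_life=1.0)     # high intake, short t1/2
bde153 = body_burden(t, intake=2.0, half_life=7.0)     # low intake, long t1/2

# Once intake stops, the burden halves every half-life:
decay = body_burden(t, intake=0.0, half_life=7.0, b0=100.0)
print(decay[70])  # value at t = 7 years: about 50
```

Fitting k and I of such a model to cross-sectional concentrations by age group is, in essence, how half-lives and intakes are inferred.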

Relevance: 100.00%

Abstract:

The use of Wireless Sensor Networks (WSNs) for Structural Health Monitoring (SHM) has become a promising approach due to many advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data synchronization error and data loss have prevented these distinct systems from being extensively used. Recently, several SHM-oriented WSNs have been proposed and believed to be able to overcome a large number of technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs with respect to demanding SHM applications like modal analysis and damage identification. This paper first presents a brief review of the most inherent uncertainties of the SHM-oriented WSN platforms and then investigates their effects on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when employing merged data from multiple tests. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and Data-driven Stochastic Subspace Identification (SSI-data), as both have been widely applied in the past decade. Experimental accelerations collected by a wired sensory system on a large-scale laboratory bridge model are initially used as clean data before being contaminated with different data pollutants in a sequential manner to simulate practical SHM-oriented WSN uncertainties. The results of this study show the robustness of FDD and the precautions needed for the SSI-data family when dealing with SHM-WSN uncertainties. Finally, the use of the measurement channel projection for the time-domain OMA techniques and the preferred combination of the OMA techniques to cope with the SHM-WSN uncertainties are recommended.
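
As a rough sketch of the FDD side of this comparison: estimate the cross-spectral density matrix between channels, take an SVD at each frequency line, and read modes off the peaks of the first singular value. The two-channel record below is synthetic (a single 2 Hz mode plus noise), not the laboratory bridge data.

```python
import numpy as np
from scipy import signal

# Synthetic two-channel record: one 2 Hz mode seen by both sensors, plus noise.
fs, n = 100.0, 8192
rng = np.random.default_rng(0)
t = np.arange(n) / fs
mode = np.sin(2 * np.pi * 2.0 * t)
acc = np.vstack([1.0 * mode, 0.6 * mode]) + 0.3 * rng.normal(size=(2, n))

# FDD: cross-spectral density matrix G(f) for every channel pair, then an SVD
# of G at each frequency line; peaks of the first singular value mark modes.
nfreq = 256 // 2 + 1
G = np.zeros((2, 2, nfreq), dtype=complex)
for i in range(2):
    for j in range(2):
        f, G[i, j] = signal.csd(acc[i], acc[j], fs=fs, nperseg=256)
s1 = np.array([np.linalg.svd(G[:, :, k], compute_uv=False)[0]
               for k in range(nfreq)])
print(f[np.argmax(s1)])  # peak singular value sits near the 2 Hz mode
```

The singular vector at a peak approximates the mode shape, which is where merged, possibly unsynchronized, setups become sensitive.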

Relevance: 100.00%

Abstract:

Within the QUT Business School (QUTBS), researchers across economics, finance and accounting depend on data-driven research. They analyze historical and global financial data across a range of instruments to understand the relationships and effects between them as they respond to news and events in their region. Scholars and Higher Degree Research (HDR) students in turn seek out universities that offer these particular datasets to further their research. This involves downloading and manipulating large datasets, often with a focus on depth of detail, frequency and long-tail historical data. This is stock exchange data with potential commercial value, so licensed access tends to be very expensive. This poster reports the following findings:
• The library has a part to play in freeing researchers from the burden of negotiating subscriptions, fundraising and managing the legal requirements around licensing and access.
• The role of the library is to communicate the nature and potential of these complex resources across the university, to disciplines as diverse as Mathematics, Health, Information Systems and Creative Industries.
• The initiative has demonstrated clear, concrete support for research by QUT Library and built relationships with faculty. It has made data available to all researchers and attracted new HDR students. The aim is to reach the threshold of research outputs to submit into FOR code 1502 (Banking, Finance and Investment) for ERA 2015.
• It is difficult to identify which subset of a dataset will be obtained, given somewhat vague price tiers.
• The integrity of the data is variable, as it is limited by the way it is collected; this occasionally raises issues for researchers (Cook, Campbell, & Kelly, 2012).
• Improved library understanding of the content of our products and the nature of finance-based research is a necessary part of the service.

Relevance: 100.00%

Abstract:

The use of Wireless Sensor Networks (WSNs) for vibration-based Structural Health Monitoring (SHM) has become a promising approach due to many advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data asynchronicity and data loss have prevented these distinct systems from being extensively used. Recently, several SHM-oriented WSNs have been proposed and believed to be able to overcome a large number of technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs with respect to demanding SHM applications like modal analysis and damage identification. Based on a brief review, this paper first reveals that Data Synchronization Error (DSE) is the most inherent factor amongst the uncertainties of SHM-oriented WSNs. The effects of this factor are then investigated on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when merging data from multiple sensor setups. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and data-driven Stochastic Subspace Identification (SSI-data), as both have been widely applied in the past decade. Accelerations collected by a wired sensory system on a large-scale laboratory bridge model are initially used as benchmark data after a certain level of noise was added to account for the higher presence of noise in SHM-oriented WSNs. From this source, a large number of simulations have been made to generate multiple DSE-corrupted datasets to facilitate statistical analyses. The results of this study show the robustness of FDD and the precautions needed for the SSI-data family when dealing with DSE at a relaxed level. Finally, the combination of preferred OMA techniques and the use of the channel projection for the time-domain OMA technique to cope with DSE are recommended.
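
The core effect of DSE can be illustrated with two sinusoids: a clock offset tau between sensors appears as an extra phase of 2*pi*f*tau in their cross-spectrum, which is what distorts mode shapes merged across setups. The sampling rate, mode frequency, and offset below are invented for illustration.

```python
import numpy as np

# A clock offset tau between two sensors adds a phase of 2*pi*f*tau to their
# cross-spectrum at frequency f, rotating merged mode-shape components.
fs, n = 100.0, 4000
f0, tau = 2.0, 0.010                       # 2 Hz mode, 10 ms sync error
t = np.arange(n) / fs
ref = np.sin(2 * np.pi * f0 * t)           # correctly clocked channel
lag = np.sin(2 * np.pi * f0 * (t - tau))   # channel with synchronization error

k = round(f0 * n / fs)                     # exact FFT bin of the mode
phase_err = np.angle(np.fft.rfft(lag)[k]) - np.angle(np.fft.rfft(ref)[k])
print(phase_err)  # about -0.1257 rad, i.e. -2*pi*f0*tau
```

Even a 10 ms offset already produces a 7-degree phase error at 2 Hz, and the error grows linearly with frequency, which is why higher modes suffer first.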

Relevance: 100.00%

Abstract:

The Echology: Making Sense of Data initiative seeks to break new ground in arts practice by asking artists to innovate with respect to a) the possible forms of data representation in public art and b) the artist's role in engaging publics on environmental sustainability in new urban developments. Initiated by ANAT and Carbon Arts in 2011, Echology has seen three artists selected by national competition in 2012 for Lend Lease sites across Australia. In 2013, commissioning of one of these works, the Mussel Choir by Natalie Jeremijenko, began in Melbourne's Victoria Harbour development. This emerging practice of data-driven and environmentally engaged public artwork presents multiple challenges to established systems of public arts production and management, at the same time as offering up new avenues for artists to forge new modes of collaboration. The experience of Echology and, in particular, the Mussel Choir is examined here to reveal opportunities for expansion of this practice through identification of the factors that lead to a resilient 'ecology of partnership' between stakeholders that include science and technology researchers, education providers, city administrators, and urban developers.

Relevance: 100.00%

Abstract:

This paper reviews the use of multi-agent systems to model the impacts of high levels of photovoltaic (PV) system penetration in distribution networks and presents some preliminary data obtained from the Perth Solar City high-penetration PV trial. The Perth Solar City trial consists of a low voltage distribution feeder supplying 75 customers, 29 of whom have rooftop photovoltaic systems. Data is collected from smart meters at each consumer's premises, from data loggers on the low voltage (LV) side of the transformer, and from a nearby distribution network SCADA measurement point on the high voltage (HV) side of the transformer. The data will be used to progressively develop multi-agent system (MAS) models.

Relevance: 100.00%

Abstract:

The hemodynamic response function (HRF) describes the local response of brain vasculature to functional activation. Accurate HRF modeling enables the investigation of cerebral blood flow regulation and improves our ability to interpret fMRI results. Block designs have been used extensively as fMRI paradigms because detection power is maximized; however, block designs are not optimal for HRF parameter estimation. Here we assessed the utility of block design fMRI data for HRF modeling. The trueness (relative deviation), precision (relative uncertainty), and identifiability (goodness-of-fit) of different HRF models were examined and test-retest reproducibility of HRF parameter estimates was assessed using computer simulations and fMRI data from 82 healthy young adult twins acquired on two occasions 3 to 4 months apart. The effects of systematically varying attributes of the block design paradigm were also examined. In our comparison of five HRF models, the model comprising the sum of two gamma functions with six free parameters had greatest parameter accuracy and identifiability. Hemodynamic response function height and time to peak were highly reproducible between studies and width was moderately reproducible but the reproducibility of onset time was low. This study established the feasibility and test-retest reliability of estimating HRF parameters using data from block design fMRI studies.
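
The best-performing model in this comparison, a sum of two gamma functions, can be sketched as follows. The six parameters and the default values below are the commonly used canonical ones (assumed here for illustration), not the study's fitted estimates.

```python
import numpy as np
from scipy import stats

# Canonical double-gamma HRF sketch with six free parameters: peak delay and
# dispersion (p1, d1), undershoot delay and dispersion (p2, d2), the
# peak-to-undershoot ratio, and an amplitude.
def hrf(t, p1=6.0, d1=1.0, p2=16.0, d2=1.0, ratio=1.0 / 6.0, amp=1.0):
    peak = stats.gamma.pdf(t, p1 / d1, scale=d1)
    undershoot = stats.gamma.pdf(t, p2 / d2, scale=d2)
    return amp * (peak - ratio * undershoot)

t = np.arange(0.0, 32.0, 0.1)   # seconds
h = hrf(t)
print(t[np.argmax(h)])          # time to peak, about 5 s with these defaults
```

Fitting such a model to block-design responses amounts to estimating these six parameters per voxel or region, from which height, time to peak, and width are derived.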

Relevance: 100.00%

Abstract:

Since 2006, we have been conducting urban informatics research that we define as “the study, design, and practice of urban experiences across different urban contexts that are created by new opportunities of real-time, ubiquitous technology and the augmentation that mediates the physical and digital layers of people networks and urban infrastructures” [1]. Various new research initiatives under the label “urban informatics” have since been started by universities (e.g., NYU’s Center for Urban Science and Progress) and industry (e.g., Arup, McKinsey) worldwide. Yet many of these new initiatives are limited to what Townsend calls “data-driven approaches to urban improvement” [2]. One of the key challenges is that no quantity of aggregated data translates directly into quality insights that better our understanding of cities. In this talk, I will raise questions about the purpose of urban informatics research beyond data, and show examples of media architecture, participatory city making, and citizen activism. I argue for (1) broadening the disciplinary foundations that urban science approaches draw on; (2) maintaining a hybrid perspective that considers both the bird’s-eye view and the citizen’s view; and (3) employing design research not to be limited to just understanding, but to bring about actionable knowledge that will drive change for good.

Relevance: 100.00%

Abstract:

Digital technology offers enormous benefits (economic, quality of design and efficiency in use) if adopted to implement integrated ways of representing the physical world in a digital form. When applied across the full extent of the built and natural world, it is referred to as the Digital Built Environment (DBE) and encompasses a wide range of approaches and technology initiatives, all aimed at the same end goal: the development of a virtual world that sufficiently mirrors the real world to form the basis for the smart cities of the present and future, enable efficient infrastructure design and programmed maintenance, and create a new foundation for economic growth and social well-being through evidence-based analysis. The creation of a National Data Policy for the DBE will facilitate the creation of additional high technology industries in Australia; provide Governments, industries and citizens with greater knowledge of the environments they occupy and plan; and offer citizen-driven innovations for the future. Australia has slipped behind other nations in the adoption and execution of Building Information Modelling (BIM) and the principal concern is that the gap is widening. Data driven innovation added $67 billion to the Australian economy in 2013 [1]. Strong open data policy equates to $16 billion in new value [2]. Australian Government initiatives such as the Digital Earth inspired “National Map” offer a platform and pathway to embrace the concept of a “BIM Globe”, while also leveraging unprecedented growth in open source / open data collaboration. Australia must address the challenges by learning from international experiences—most notably the UK and NZ—and mandate the use of BIM across Government, extending the Framework for Spatial Data Foundation to include the Built Environment as a theme and engaging collaboration through a “BIM globe” metaphor. This proposed DBE strategy will modernise the Australian urban planning and the construction industry.
It will change the way we develop our cities by fundamentally altering the dynamics and behaviours of the supply chains and unlocking new and more efficient ways of collaborating at all stages of the project life-cycle. There are currently two major modelling approaches that contribute to the challenge of delivering the DBE. Though these collectively encompass many (often competing) approaches or proprietary software systems, all can be categorised as either a spatial modelling approach, where the focus is generally on representing the elements that make up the world within their geographic context, or a construction modelling approach, where the focus is on models that support the life cycle management of the built environment. These two approaches have tended to evolve independently, addressing two broad industry sectors: one concerned with understanding and managing global and regional aspects of the world that we inhabit, including disciplines concerned with climate, earth sciences, land ownership, urban and regional planning and infrastructure management; the other concerned with the planning, design, construction and operation of built facilities, including architectural and engineering design, product manufacturing, construction, facility management and related disciplines (a process/technology commonly known as Building Information Modelling, BIM). The spatial industries have a strong voice in the development of public policy in Australia, while the construction sector, which in 2014 accounted for around 8.5% of Australia’s GDP [3], has no single voice and, because of its diversity, is struggling to adapt to and take advantage of the opportunity presented by these digital technologies.
The experience in the UK over the past few years has demonstrated that government leadership is very effective in stimulating industry adoption of digital technologies by, on the one hand, mandating the use of BIM on public procurement projects while at the same time, providing comparatively modest funding to address the common issues that confront the industry in adopting that way of working across the supply chain. The reported result has been savings of £840m in construction costs in 2013/14 according to UK Cabinet Office figures [4]. There is worldwide recognition of the value of bringing these two modelling technologies together. Australia has the expertise to exercise leadership in this work, but it requires a commitment by government to recognise the importance of BIM as a companion methodology to the spatial technologies so that these two disciplinary domains can cooperate in the development of data policies and information exchange standards to smooth out common workflows. buildingSMART Australasia, SIBA and their academic partners have initiated this dialogue in Australia and wish to work collaboratively, with government support and leadership, to explore the opportunities open to us as we develop an Australasian Digital Built Environment. As part of that programme, we must develop and implement a strategy to accelerate the adoption of BIM processes across the Australian construction sector while at the same time, developing an integrated approach in concert with the spatial sector that will position Australia at the forefront of international best practice in this area. Australia and New Zealand cannot afford to be on the back foot as we face the challenges of rapid urbanisation and change in the global environment. Although we can identify some exemplary initiatives in this area, particularly in New Zealand in response to the need for more resilient urban development in the face of earthquake threats, there is still much that needs to be done.
We are well situated in the Asian region to take a lead in this challenge, but we are at imminent risk of losing the initiative if we do not take action now. Strategic collaboration between Governments, Industry and Academia will create new jobs and wealth, with the potential, for example, to save around 20% on the delivery costs of new built assets, based on recent UK estimates.

Relevance: 100.00%

Abstract:

Robust estimation often relies on a dispersion function that is more slowly varying at large values than the square function. However, the choice of tuning constant in dispersion functions may impact the estimation efficiency to a great extent. For a given family of dispersion functions such as the Huber family, we suggest obtaining the "best" tuning constant from the data so that the asymptotic efficiency is maximized. This data-driven approach can automatically adjust the value of the tuning constant to provide the necessary resistance against outliers. Simulation studies show that substantial efficiency can be gained by this data-dependent approach compared with the traditional approach in which the tuning constant is fixed. We briefly illustrate the proposed method using two datasets.
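
A sketch of the idea for the Huber family: for each candidate tuning constant c, estimate the asymptotic variance V(c) = E[psi_c(r)^2] / (E[psi_c'(r)])^2 from standardized residuals and keep the minimizer (equivalently, the efficiency maximizer). The residual distribution and grid below are invented, and the paper's estimator details are not reproduced.

```python
import numpy as np

# Huber psi and a plug-in estimate of the asymptotic variance V(c); for the
# Huber function, E[psi'] reduces to P(|r| <= c).
def huber_psi(r, c):
    return np.clip(r, -c, c)

def asymptotic_variance(r, c):
    num = np.mean(huber_psi(r, c) ** 2)
    den = np.mean(np.abs(r) <= c) ** 2
    return num / den

rng = np.random.default_rng(0)
r = rng.normal(size=2000)
r[:100] += rng.choice([-8.0, 8.0], size=100)   # 5% gross outliers

grid = np.linspace(0.5, 3.0, 26)
best_c = grid[np.argmin([asymptotic_variance(r, c) for c in grid])]
print(best_c)
```

With clean data the minimizer drifts toward large c (approaching least squares); with contamination it settles at a smaller c, which is the automatic resistance the abstract describes.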