909 results for Data-Driven Behavior Modeling


Relevance: 40.00%

Abstract:

Summaries of the data gathered for this project.

Relevance: 40.00%

Abstract:

This paper identifies areas for future research by addressing accounting issues faced by management accountants practicing in hospitality organizations. Specifically, the article focuses on the use of the uniform system of accounts by operating properties, the usefulness of allocating support costs to operated departments, the extension of our understanding of operating costs and performance measurement systems, and the certification of practicing accountants.

Relevance: 40.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 40.00%

Abstract:

We analyze a real data set of reindeer fecal pellet-group counts obtained from a survey conducted in a forest area in northern Sweden. In the data set, over 70% of the counts are zeros, and there is high spatial correlation. We use conditionally autoregressive random effects to model spatial correlation in a Poisson generalized linear mixed model (GLMM), a quasi-Poisson hierarchical generalized linear model (HGLM), a zero-inflated Poisson (ZIP) model, and hurdle models. The quasi-Poisson HGLM allows for both under- and overdispersion with excessive zeros, while the ZIP and hurdle models allow only for overdispersion. In analyzing the data, we find that the quasi-Poisson HGLMs can outperform the other commonly used models, such as ordinary Poisson HGLMs, spatial ZIP models, and spatial hurdle models, and that underdispersed Poisson HGLMs with spatial correlation fit the reindeer data best. We develop R code for fitting these models using a unified algorithm for the HGLMs. In sum, spatial count responses with an extremely high proportion of zeros and underdispersion can be successfully modeled using the quasi-Poisson HGLM with spatial random effects.
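As a point of reference for the model class, the non-spatial ZIP baseline mentioned above can be sketched in a few lines. The snippet below simulates counts with roughly 70% excess zeros and fits an intercept-only ZIP model with statsmodels; this is only an illustrative sketch, in Python rather than the authors' R, and is not the paper's spatial quasi-Poisson HGLM.

```python
import numpy as np
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(1)
n = 500
structural_zero = rng.random(n) < 0.7           # ~70% excess zeros
counts = np.where(structural_zero, 0, rng.poisson(3.0, size=n))

X = np.ones((n, 1))                             # intercept-only design
zip_model = ZeroInflatedPoisson(counts, X, exog_infl=X, inflation='logit')
zip_fit = zip_model.fit(disp=False)
print(zip_fit.params)                           # inflation and count-model coefficients
```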

Relevance: 40.00%

Abstract:

This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide tools for univariate applications, while the last two develop multivariate methodologies.

In Chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains from the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models.

In Chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model that allows for a time-varying intercept and is implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set.

Chapter 3 is joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of the disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient and easy alternative for measuring integrated covariances from noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics.

Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting that includes short-selling constraints and transaction costs.
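For readers unfamiliar with realized measures, the basic high-frequency input to models such as Realized GARCH and FloGARCH is the daily realized variance, the sum of squared intraday returns. A minimal sketch under stated assumptions (a pandas Series of intraday prices indexed by timestamp; not the thesis's code):

```python
import numpy as np
import pandas as pd

def realized_variance(prices: pd.Series) -> pd.Series:
    """Daily realized variance: the sum of squared intraday log returns."""
    log_ret = np.log(prices).diff().dropna()
    return (log_ret ** 2).groupby(log_ret.index.date).sum()

# Toy usage: ~two days of 5-minute prices simulated from a random walk.
idx = pd.date_range("2024-01-01 09:30", periods=300, freq="5min")
prices = pd.Series(100 * np.exp(np.random.default_rng(0)
                                .normal(0, 1e-3, 300).cumsum()), index=idx)
print(realized_variance(prices))
```

Five-minute sampling is a common compromise: finer grids use more data but are increasingly contaminated by microstructure noise, which is exactly what the pre-averaging and realized-kernel estimators discussed in the dissertation are designed to handle.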

Relevance: 40.00%

Abstract:

Following the workshop on new developments in daily licensing practice in November 2011, we brought together fourteen representatives from national consortia (from Denmark, Germany, the Netherlands and the UK) and publishers (Elsevier, SAGE and Springer) in Copenhagen on 9 March 2012 to discuss provisions in licences to accommodate new developments. The one-day workshop aimed to: present background and ideas regarding the provisions the KE Licensing Expert Group developed; introduce and explain the provisions the invited publishers currently use; ascertain agreement on the wording for long-term preservation, continuous access and course packs; give insight and more clarity about the use of open access provisions in licences; discuss a roadmap for inclusion of the provisions in the publishers' licences; and result in a report disseminating the outcome of the meeting.

Participants of the workshop were:
United Kingdom: Lorraine Estelle (Jisc Collections)
Denmark: Lotte Eivor Jørgensen (DEFF), Lone Madsen (University of Southern Denmark), Anne Sandfær (DEFF/Knowledge Exchange)
Germany: Hildegard Schaeffler (Bavarian State Library), Markus Brammer (TIB)
The Netherlands: Wilma Mossink (SURF), Nol Verhagen (University of Amsterdam), Marc Dupuis (SURF/Knowledge Exchange)
Publishers: Alicia Wise (Elsevier), Yvonne Campfens (Springer), Bettina Goerner (Springer), Leo Walford (Sage)
Knowledge Exchange: Keith Russell

The main outcome of the workshop was that it would be valuable to have a standard set of clauses that could be used in negotiations; this would make concluding licences much easier and more efficient. The comments on the model provisions the Licensing Expert Group had drafted will be taken into account, and the provisions will be reformulated. Data and text mining is a new development, and demand for access to allow it is growing. It would be easier if there were a simpler way to access materials so they could be mined. However, there are still outstanding questions about how the authors of articles that have been mined can be properly attributed.

Relevance: 40.00%

Abstract:

Experimental and analytical studies were conducted to explore thermo-acoustic coupling during the onset of combustion instability in various air-breathing combustor configurations. These include a laboratory-scale 200-kW dump combustor and a 100-kW augmentor featuring a v-gutter flame holder, used to simulate main combustion chambers and afterburners in aero engines, respectively. The three primary themes of this work are: 1) modeling heat release fluctuations for stability analysis, 2) conducting active combustion control with alternative fuels, and 3) demonstrating practical active control for augmentor instability suppression.

The phenomenon of combustion instability remains an unsolved problem in propulsion engines, mainly because of the difficulty of predicting the fluctuating component of heat release without extensive testing. A hybrid model was developed to describe both the temporal and spatial variations in dynamic heat release, using a separation-of-variables approach that requires only a limited amount of experimental data. The use of sinusoidal basis functions further reduces the amount of data required. When the mean heat release behavior is known, the only experimental data needed for a detailed stability analysis is one instantaneous picture of heat release at the peak pressure phase. This model was successfully tested in the dump combustor experiments, reproducing the correct sign of the overall Rayleigh index as well as a remarkably accurate spatial distribution pattern of fluctuating heat release.

Active combustion control was explored for fuel-flexible combustor operation using twelve different jet fuels, including bio-synthetic and Fischer-Tropsch types. Analysis using an actuated spray combustion model revealed that the combustion response times of these fuels were similar. Combined with experimental spray characterizations, this suggested that controller performance should remain effective with the various alternative fuels. Active control experiments validated this analysis while demonstrating a 50-70% reduction in the peak spectral amplitude.

A new model augmentor was built and tested for combustion dynamics using schlieren and chemiluminescence techniques. Novel active control techniques, including pulsed air injection, were implemented, and the results were compared with the pulsed fuel injection approach. The pulsed injection of secondary air suppressed the augmentor instability just as effectively, opening the possibility of a more efficient actuation strategy.
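As context for the Rayleigh index mentioned above: it is the time average of the product of pressure and heat-release fluctuations, and is positive when unsteady heat release feeds energy into the acoustic field. A minimal sketch under stated assumptions (signal names and discretization are illustrative, not the authors' code):

```python
import numpy as np

def rayleigh_index(p: np.ndarray, q: np.ndarray) -> float:
    """Time-averaged product of pressure (p) and heat-release (q)
    fluctuations; a positive value indicates instability driving."""
    p_fluct = p - p.mean()
    q_fluct = q - q.mean()
    return float(np.mean(p_fluct * q_fluct))

# Toy usage: in-phase fluctuations give a positive (driving) index.
t = np.linspace(0.0, 1.0, 10_000)
p = np.sin(2 * np.pi * 200 * t)          # 200 Hz pressure oscillation
q = 0.5 * np.sin(2 * np.pi * 200 * t)    # heat release in phase with p
print(rayleigh_index(p, q))              # ~0.25 > 0: driving
```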

Relevance: 40.00%

Abstract:

Symbolic execution is a powerful program analysis technique, but it is very challenging to apply to programs built on event-driven frameworks such as Android, mainly because the framework code itself is too complex to execute symbolically. The standard solution is to manually create a framework model that is simpler and more amenable to symbolic execution. However, developing and maintaining such a model by hand is difficult and error-prone. We claim that program synthesis can introduce a high degree of automation into framework modeling. To support this thesis, we present three pieces of work.

First, we introduced SymDroid, a symbolic executor for Android. While Android apps are written in Java, they are compiled to the Dalvik bytecode format. Instead of analyzing an app's Java source, which may not be available, or decompiling from Dalvik back to Java, which requires significant engineering effort and introduces yet another source of potential bugs in an analysis, SymDroid works directly on Dalvik bytecode.

Second, we introduced Pasket, a new system that takes a first step toward automatically generating Java framework models to support symbolic execution. Pasket takes as input the framework API and tutorial programs that exercise the framework. From these artifacts and Pasket's internal knowledge of design patterns, Pasket synthesizes an executable framework model by instantiating design patterns, such that the behavior of the synthesized model on the tutorial programs matches that of the original framework.

Lastly, to scale program synthesis to framework models, we devised adaptive concretization, a novel program synthesis algorithm that combines the best of the two major synthesis strategies: symbolic search, i.e., using SAT or SMT solvers, and explicit search, e.g., stochastic enumeration of possible solutions. Adaptive concretization runs multiple sub-synthesis problems in parallel by partially concretizing highly influential unknowns in the original synthesis problem. Thanks to adaptive concretization, Pasket can generate large-scale models of thousands of lines of code. In addition, we used an Android model synthesized by Pasket and found it sufficient for SymDroid to execute a range of apps.
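To make the idea concrete, here is a deliberately toy-scale sketch of adaptive concretization under stated assumptions: the "synthesis problem" (finding integer coefficients that make a linear template match input/output examples), the domains, and all names are illustrative, with exhaustive search standing in for the SAT/SMT-backed symbolic step. It is not Pasket's implementation.

```python
import random
from concurrent.futures import ThreadPoolExecutor

SPEC = [(0, 3), (1, 5), (2, 7)]        # input/output examples of y = 2x + 3
DOMAIN = list(range(-10, 11))          # domain of each unknown coefficient

def satisfies(a, b):
    return all(a * x + b == y for x, y in SPEC)

def sub_search(a_fixed):
    # 'a' (the influential unknown) is concretized; searching the residue
    # for 'b' stands in for a SAT/SMT-backed symbolic sub-synthesis.
    for b in DOMAIN:
        if satisfies(a_fixed, b):
            return (a_fixed, b)
    return None

# Concretize 'a' to random values and attack the sub-problems in parallel.
trials = random.sample(DOMAIN, k=len(DOMAIN))
with ThreadPoolExecutor() as pool:
    solutions = [s for s in pool.map(sub_search, trials) if s is not None]
print(solutions)                       # [(2, 3)]: y = 2x + 3 recovered
```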

Relevance: 40.00%

Abstract:

The analysis of steel and composite frames has traditionally been carried out by idealizing beam-to-column connections as either rigid or pinned. Although some advanced analysis methods have been proposed to account for semi-rigid connections, the performance of these methods strongly depends on proper modeling of connection behavior. The primary challenge of modeling beam-to-column connections is their inelastic response and continuously varying stiffness, strength, and ductility. In this dissertation, two distinct approaches, mathematical models and informational models, are proposed to account for the complex hysteretic behavior of beam-to-column connections. The performance of the two approaches is examined, followed by a discussion of their merits and deficiencies. To capitalize on the merits of both representations, a new approach, a hybrid modeling framework, is developed and demonstrated on beam-to-column connections.

Component-based modeling is a compromise between two extremes in the field of mathematical modeling: simplified global models and finite element models. In the component-based modeling of angle connections, five critical components of excessive deformation are identified. Constitutive relationships for the angles, the column panel zone, and the contact between angles and column flanges are derived from material and geometric properties and theoretical mechanics considerations alone. Those for slip and bolt-hole ovalization are simplified by empirically suggested mathematical representations and expert opinion. A mathematical model is then assembled as a macro-element by combining rigid bars and springs that represent the constitutive relationships of the components. Lastly, the moment-rotation curves of the mathematical models are compared with those of experimental tests. For a top-and-seat angle connection with double web angles, the pinched hysteretic response is predicted quite well by complete mechanical models that use only material and geometric properties. On the other hand, to exhibit the highly pinched behavior of a top-and-seat angle connection without web angles, a mathematical model requires the slip and bolt-hole ovalization components, which are more amenable to informational modeling.

The alternative is informational modeling, which constitutes a fundamental shift from mathematical equations to data that contain the required information about the underlying mechanics. The information is extracted from observed data and stored in neural networks. Two training data sets, analytically generated and experimental, are tested to examine the performance of informational models. Both show acceptable agreement with the moment-rotation curves of the experiments. Adding a degradation parameter improves the informational models when modeling highly pinched hysteretic behavior. However, informational models cannot represent the contribution of individual components and therefore provide no insight into their underlying mechanics.

In this study, a new hybrid modeling framework is proposed, in which a conventional mathematical model is complemented by informational methods. The basic premise of the proposed hybrid methodology is that not all features of system response are amenable to mathematical modeling, hence the informational alternatives. This may be because (i) the underlying theory is not available or not sufficiently developed, or (ii) the existing theory is too complex and therefore not suitable for modeling within building frame analysis. The role of the informational methods is to model the aspects that the mathematical model leaves out; the autoprogressive algorithm and self-learning simulation extract these missing aspects from the system response. In the hybrid framework, experimental data is an integral part of modeling rather than being used strictly for validation.

The potential of the hybrid methodology is illustrated by modeling the complex hysteretic behavior of beam-to-column connections. Mechanics-based components of deformation, such as the angles, flange plates, and column panel zone, are idealized in a mathematical model using a complete mechanical approach. Although the mathematical model represents the envelope curves in terms of initial stiffness and yield strength, it cannot capture the pinching effects. Pinching is caused mainly by separation between the angles and column flanges as well as slip between the angles or flange plates and the beam flanges; these components of deformation are suitable for informational modeling. Finally, the moment-rotation curves of the hybrid models are validated against those of the experimental tests. The comparison shows that the hybrid models are capable of representing the highly pinched hysteretic behavior of beam-to-column connections. In addition, the developed hybrid model is successfully used to predict the behavior of a newly designed connection.
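A schematic sketch of the hybrid composition, under stated assumptions: a deliberately simple bilinear spring stands in for the mechanics-based backbone, and a placeholder callable stands in for the trained neural-network correction that would capture pinching. None of the values or names below are the dissertation's actual formulation.

```python
def bilinear_moment(theta: float, k_init: float = 1.0e4,
                    k_post: float = 1.0e3, m_yield: float = 150.0) -> float:
    """Mechanics-based backbone: elastic, then strain-hardening response
    (illustrative stiffness/strength values, in consistent units)."""
    theta_y = m_yield / k_init
    if abs(theta) <= theta_y:
        return k_init * theta
    sign = 1.0 if theta > 0 else -1.0
    return sign * (m_yield + k_post * (abs(theta) - theta_y))

def hybrid_moment(theta: float, correction=lambda t: 0.0) -> float:
    """Hybrid response: mathematical backbone plus a data-driven correction
    (a trained network capturing pinching would be passed as `correction`)."""
    return bilinear_moment(theta) + correction(theta)

print(hybrid_moment(0.02))   # backbone only, zero correction: 155.0
```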

Relevance: 40.00%

Abstract:

Discovery Driven Analysis (DDA) is a common feature of OLAP technology for analyzing structured data. In essence, DDA helps analysts discover anomalous data by highlighting 'unexpected' values in the OLAP cube. By indicating to the analyst which dimensions to explore, DDA speeds up the process of discovering anomalies and their causes. However, Discovery Driven Analysis (and OLAP in general) is only applicable to structured data, such as records in databases. We propose a system that extends DDA technology to semi-structured text documents, that is, text documents accompanied by a small amount of structured data. Our system pipeline consists of two stages: first, the text part of each document is structured around user-specified dimensions using a semi-PLSA algorithm; then we adapt DDA to these fully structured documents, thus enabling DDA on text documents. We present some applications of this system in OLAP analysis and show how scalability issues are solved. Results show that our system can handle reasonably large document datasets in real time, without any need for pre-computation.
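A minimal sketch of the 'unexpected value' idea at the heart of DDA, under stated assumptions: expected cell values come from a simple additive (row plus column effect) model of a cube slice, and cells with large standardized residuals are flagged. This illustrates the general discovery-driven principle, not the paper's actual estimator.

```python
import numpy as np

cube_slice = np.array([[10., 12., 11.],
                       [ 9., 45., 10.],    # 45.0 is the planted anomaly
                       [11., 10., 12.]])

grand = cube_slice.mean()
row_eff = cube_slice.mean(axis=1, keepdims=True) - grand
col_eff = cube_slice.mean(axis=0, keepdims=True) - grand
expected = grand + row_eff + col_eff       # additive model of the slice

residual = cube_slice - expected
surprise = np.abs(residual) / residual.std()
print(np.argwhere(surprise > 1.5))         # cells to highlight: [[1 1]]
```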

Relevance: 40.00%

Abstract:

Deficits in social communication and interaction have been identified as distinguishing impairments for individuals with an autism spectrum disorder (ASD). As a pivotal skill, the successful development of social communication and interaction in individuals with ASD is a lifelong objective. Point-of-view video modeling has the potential to address these deficits. This type of video involves filming the completion of a targeted skill or behavior from a first-person perspective. By presenting only what a person might see from his or her own viewpoint, this format limits irrelevant stimuli and provides a clear frame of reference that facilitates imitation. The current study investigated the use of point-of-view video modeling in teaching social initiations (e.g., greetings). Using a multiple-baseline-across-participants design, five kindergarten participants were taught social initiations through point-of-view video modeling and video priming. Immediately before and after viewing the entire point-of-view video model, the participants were evaluated on their social initiations with a trained, typically developing peer serving as a communication partner. Specifically, the social initiations involved the participants' ability to shift their attention toward the peer entering the classroom, maintain attention toward the peer, and engage in an appropriate social initiation (e.g., hi, hello). Both generalization and maintenance were tested. Overall, the data suggest point-of-view video modeling is an effective intervention for increasing social initiations in young students with ASD. However, retraining was necessary for acquisition of the skills in the classroom environment. Generalization to novel environments, to a novel communication partner, and to other social initiation skills was limited. Additionally, maintenance of the gained social initiation skills occurred only in the intervention room. Despite the limitations of the study and the variable results, there are a number of implications for both practitioners and future researchers examining point-of-view modeling and its potential impact on the social initiation skills of individuals with ASD.

Relevance: 40.00%

Abstract:

Terrestrial planets produce crusts as they differentiate. The Earth's bi-modal crust, with a high-standing granitic continental crust and a low-standing basaltic oceanic crust, is unique in our solar system and links the evolution of the interior and exterior of this planet. Here I present geochemical observations that constrain the processes accompanying crustal formation and evolution. My approach includes geochemical analyses, quantitative modeling, and experimental studies.

The Archean crustal evolution project represents my perspective on when Earth's continental crust began forming. In this project, I utilized critical element ratios in sedimentary records to track the evolution of the MgO content of the upper continental crust as a function of time. The early Archean subaerial crust had >11 wt. % MgO, whereas by the end of the Archean its composition had evolved to about 4 wt. % MgO, suggesting a transition of the upper crust from a basalt-like to a more granite-like bulk composition. Driving this fundamental change in upper crustal composition is the widespread operation of subduction processes, suggesting the onset of global plate tectonics at ~3 Ga.

Three of the chapters in this dissertation leverage Eu anomalies to track the recycling of crustal materials back into the mantle; the Eu anomaly is a sensitive measure of the element's behavior relative to the neighboring lanthanoids (Sm and Gd) during crustal differentiation. My compilation of Sm-Eu-Gd data for the continental crust shows that the average crust has a net negative Eu anomaly. This result requires recycling of Eu-enriched lower continental crust into the mantle. Mass balance calculations require that about three times the mass of the modern continental crust was returned to the mantle over Earth's history, possibly via density-driven recycling. High-precision measurements of Eu/Eu* in selected primitive glasses of mid-ocean ridge basalt (MORB) from global MORs, combined with numerical modeling, suggest that the recycled lower crustal materials are not found within the MORB source and may have at least partially sunk into the lower mantle, where they can be sampled by hot spot volcanoes.

The Lesser Antilles Li isotope project provides insights into the Li systematics of this young island arc, a representative section of proto-continental crust. Martinique Island lavas, to my knowledge, represent the only clear case in which crustal Li is recycled back into the mantle source, as documented by the isotopically light Li in Lesser Antilles sediments that feed into the forearc subduction trench. By corollary, the mantle-like Li signal in global arc lavas is likely the result of broadly similar Li isotopic compositions between the upper mantle and bulk subducting sediments in most arcs.

My PhD project on the Li diffusion mechanism in zircon is being carried out in extensive collaboration with multiple institutes and employs analytical, experimental and modeling studies. This ongoing project finds that REE and Y play an important role in controlling Li diffusion in natural zircons, with Li partially coupling to REE and Y to maintain charge balance. Access to state-of-the-art instrumentation presented critical opportunities to identify the mechanisms that cause elemental fractionation during laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) analysis; my work here elucidates the elemental fractionation associated with plasma plume condensation during laser ablation and particle-ion conversion in the ICP.
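For reference, by standard geochemical convention (not specific to this dissertation), the Eu anomaly compares measured Eu with the value interpolated from its neighbors Sm and Gd, using chondrite-normalized (subscript N) concentrations:

\[
\mathrm{Eu/Eu^{*}} = \frac{\mathrm{Eu_{N}}}{\sqrt{\mathrm{Sm_{N}}\,\mathrm{Gd_{N}}}}
\]

Values below 1 indicate a negative anomaly (Eu depletion), as reported above for the average continental crust.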

Relevance: 40.00%

Abstract:

The protein lysate array is an emerging technology for quantifying protein concentration ratios in multiple biological samples. It is gaining popularity and has the potential to answer questions about post-translational modifications and protein pathway relationships. Statistical inference for a parametric quantification procedure has been inadequately addressed in the literature, mainly due to two challenges: the increasing dimension of the parameter space and the need to account for dependence in the data. Each chapter of this thesis addresses one of these issues. In Chapter 1, an introduction to protein lysate array quantification is presented, followed by the motivations and goals for this thesis work. In Chapter 2, we develop a multi-step procedure for the Sigmoidal models, ensuring consistent estimation of the concentration level with full asymptotic efficiency. The results obtained in this chapter justify inferential procedures based on large-sample approximations. Simulation studies and real data analysis are used to illustrate the performance of the proposed method in finite samples. The multi-step procedure is simpler in both theory and computation than the single-step least squares method used in current practice. In Chapter 3, we introduce a new model that accounts for the dependence structure of the errors via a nonlinear mixed effects model. We consider a method to approximate the maximum likelihood estimator of all the parameters. Using simulation studies on various error structures, we show that for data with non-i.i.d. errors the proposed method leads to more accurate estimates and better confidence intervals than the existing single-step least squares method.
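A hedged sketch of the parametric quantification step: fitting a four-parameter sigmoidal (logistic) dilution curve by nonlinear least squares. This illustrates the class of Sigmoidal models discussed above, not the thesis's multi-step estimator; the parameter names and simulated data are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, lower, upper, x0, slope):
    """Four-parameter logistic: asymptotes `lower`/`upper`,
    midpoint `x0`, steepness `slope`."""
    return lower + (upper - lower) / (1.0 + np.exp(-slope * (x - x0)))

dilution = np.linspace(0, 10, 30)                  # log-dilution steps
rng = np.random.default_rng(0)
signal = sigmoid(dilution, 0.5, 4.0, 5.0, 1.2) + rng.normal(0, 0.05, 30)

params, _ = curve_fit(sigmoid, dilution, signal, p0=[0.0, 5.0, 4.0, 1.0])
print(dict(zip(["lower", "upper", "x0", "slope"], params)))
```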

Relevance: 40.00%

Abstract:

Chiasma and crossover are two related biological processes of great importance in the understanding of genetic variation. The study of these processes is straightforward in organisms where all products of meiosis are recovered and can be observed; this is not the case in mammals. Our understanding of these processes therefore depends on our ability to model them. In this study I describe the biological processes that underlie chiasma and crossover, as well as the two main inference problems associated with them: (i) in mammals we recover only one of the four products of meiosis, and (ii) in general we do not observe where the crossovers actually happen, only an interval containing them (type-2 censored information). An NPML (nonparametric maximum likelihood) estimate was proposed in this work and used to compare chromosome length and chromosome expansion across the crosses.