881 results for "Regression-based decomposition".


Relevance: 30.00%

Abstract:

Tourist accommodation expenditure is a widely investigated topic, as it represents a major contribution to total tourist expenditure. The identification of its determinant factors is commonly based on supply-driven applications, while little research has addressed important travel characteristics. This paper proposes a demand-driven analysis of tourist accommodation price by focusing on data generated from room bookings. The investigation focuses on modeling the relationship between key travel characteristics and the price paid to book the accommodation. To accommodate the distributional characteristics of the expenditure variable, the analysis is based on the estimation of a quantile regression model. The findings support the econometric approach used and enable relevant managerial implications to be drawn.
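Quantile regression replaces the squared-error criterion of ordinary least squares with the asymmetric pinball (check) loss, which is what lets it target any quantile of the price distribution rather than the mean. A minimal, dependency-free sketch, restricted to a constant model for brevity and using invented nightly rates (not the paper's data):

```python
def pinball_loss(residual, tau):
    # "check" loss: penalises under- and over-prediction asymmetrically
    return tau * residual if residual >= 0 else (tau - 1.0) * residual

def best_constant_quantile(prices, tau):
    # the constant minimising total pinball loss is the tau-th sample quantile
    return min(prices, key=lambda c: sum(pinball_loss(p - c, tau) for p in prices))

prices = [60, 75, 80, 95, 110, 140, 200]          # hypothetical nightly booking rates
median_price = best_constant_quantile(prices, 0.5)  # tau = 0.5 recovers the median
upper_price = best_constant_quantile(prices, 0.9)   # tau = 0.9 targets expensive bookings
```

In the paper's setting the constant would be replaced by a linear function of the travel characteristics, but the loss being minimised is the same.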

Relevance: 30.00%

Abstract:

The examination of workplace aggression as a global construct has gained considerable attention over the past few years as organizations work to better understand and address the occurrence and consequences of this challenging phenomenon. The purpose of this dissertation is to build on previous efforts to validate the appropriateness and usefulness of a global conceptualization of the workplace aggression construct. The dissertation comprises two parts: Part 1 used a confirmatory factor analysis approach to assess the existence of workplace aggression as a global construct; Part 2 used a series of correlational analyses to examine the relationship between a selection of commonly experienced individual strain-based outcomes and the global construct conceptualization assessed in Part 1. Participants were a diverse sample of 219 working individuals from Amazon's Mechanical Turk participant pool. Results of Part 1 did not show support for a one-factor global construct conceptualization of workplace aggression. However, support was shown for a higher-order five-factor model, suggesting that it may be possible to conceptualize workplace aggression as an overarching construct made up of separate workplace aggression constructs. Results of Part 2 showed support for the relationships between an existing global construct workplace aggression conceptualization and a series of strain-based outcomes. Additional post-hoc correlational analyses showed that individual factors such as emotional intelligence and personality are related to the experience of workplace aggression. Further, moderated regression analysis demonstrated that individuals experiencing high levels of workplace aggression reported higher job satisfaction when they felt strongly that the aggressive act was highly visible and, similarly, when they felt that there was a clear intent to cause harm. Overall, the findings support the need to simplify the current measurement of workplace aggression. Future research should continue to examine workplace aggression in an effort to shed additional light on the structure and usefulness of this complex construct.

Relevance: 30.00%

Abstract:

The purpose of this study was to explore the relationship between faculty perceptions, selected demographics, implementation of elements of transactional distance theory, and online web-based course completion rates. This theory posits that the high transactional distance of online courses makes it difficult for students to complete these courses successfully; too often this is associated with low completion rates. Faculty members play an indispensable role in course design, whether online or face-to-face. They also influence course delivery format from design through implementation and ultimately shape how students experience the course. This study used transactional distance theory as the conceptual framework to examine the relationship between the teaching and learning strategies used by faculty members and students' completion of online courses. Faculty members' sex, number of years teaching online at the college, and their online course completion rates were considered. A researcher-developed survey was used to collect data from 348 faculty members who teach online at two prominent colleges in the southeastern United States. An exploratory factor analysis resulted in six factors related to transactional distance theory, accounting for slightly over 65% of the variance of transactional distance scores as measured by the survey instrument. The results provided support for Moore's (1993) theory of transactional distance. Female faculty members scored higher than men on all factors of transactional distance theory. The number of years teaching online at the college level correlated significantly with all elements of the theory. Regression analysis determined that two of the factors, instructor interface and instructor-learner interaction, accounted for 12% of the variance in student online course completion rates. Of the six factors found, these two had the highest percentage scores. This finding, while in alignment with the literature concerning the dialogue element of transactional distance theory, draws special attention to the importance of instructor interface as a factor. Surprisingly, given the reviewed literature on transactional distance theory, learner-learner interaction did not emerge as an important factor in faculty perceptions, and no learner-content interaction factor was found.

Relevance: 30.00%

Abstract:

Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models supporting the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat; it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative, and geospatial variables differing in scale, weight, and type. Though many of these variables are recognized by specialists in security studies, controversy remains with respect to their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically weighted regression analysis produced the most accurate results by accommodating non-stationary coefficient behavior, demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism. This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders, such as policy-makers and law enforcement, in their efforts to improve national security and enhance quality of life.
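Geographically weighted regression handles the non-stationary coefficients mentioned above by fitting a separate, distance-weighted regression at every location, so the estimated coefficient is allowed to vary across space. A minimal one-predictor sketch with a Gaussian kernel; the locations, data, and bandwidth below are invented for illustration and are not from the dissertation:

```python
import math

def gwr_local_slope(points, x0, y0, bandwidth):
    """Weighted least-squares slope of response on predictor at location (x0, y0).

    points: iterable of (loc_x, loc_y, predictor, response). A Gaussian
    kernel down-weights observations far from the regression point.
    """
    w = [math.exp(-((lx - x0) ** 2 + (ly - y0) ** 2) / (2.0 * bandwidth ** 2))
         for lx, ly, _, _ in points]
    total = sum(w)
    xbar = sum(wi * p[2] for wi, p in zip(w, points)) / total
    ybar = sum(wi * p[3] for wi, p in zip(w, points)) / total
    num = sum(wi * (p[2] - xbar) * (p[3] - ybar) for wi, p in zip(w, points))
    den = sum(wi * (p[2] - xbar) ** 2 for wi, p in zip(w, points))
    return num / den

# hypothetical data: the predictor-response slope is +2 in the west, -1 in the east
west = [(0.0, 0.0, x, 2.0 * x) for x in (1.0, 2.0, 3.0)]
east = [(10.0, 0.0, x, -1.0 * x) for x in (1.0, 2.0, 3.0)]
slope_west = gwr_local_slope(west + east, 0.0, 0.0, bandwidth=1.0)
slope_east = gwr_local_slope(west + east, 10.0, 0.0, bandwidth=1.0)
```

A global (ordinary) regression would average the two regimes away; the local fits recover each one, which is the property the dissertation exploits.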

Relevance: 30.00%

Abstract:

A novel route to prepare highly active and stable N2O decomposition catalysts based on Fe-exchanged beta zeolite is presented. The procedure consists of liquid-phase Fe(III) exchange at low pH. By varying the pH systematically from 3.5 to 0, using nitric acid during each Fe(III)-exchange step, the degree of dealumination was controlled, as verified by ICP and NMR. Dealumination alters the occurrence of octahedral Al sites neighbouring the Fe sites, improving performance in this reaction. The catalysts so obtained exhibit a remarkable enhancement in activity, with an optimal pH of 1. Further optimization by increasing the Fe content is possible. The optimal formulation showed good conversion levels, comparable to a benchmark Fe-ferrierite catalyst. Catalyst stability under tail-gas conditions containing NO, O2 and H2O was excellent, without any appreciable activity decay during 70 h on stream. Based on characterisation and data analysis from ICP, single-pulse excitation NMR, MQ MAS NMR, N2 physisorption, TPR(H2) analysis and apparent activation energies, the improved catalytic performance is attributed to an increased concentration of active sites. Temperature-programmed reduction experiments reveal significant changes in the Fe(III) reducibility pattern, with two reduction peaks: one tentatively attributed to the interaction of Fe-oxo species with electron-withdrawing extraframework AlO6 species, causing delayed reduction, and a low-temperature peak attributed to Fe species exchanged on zeolitic AlO4 sites, which are partially charged by the presence of neighbouring extraframework AlO6 sites. Improved mass transport due to acid leaching is ruled out. The increased activity is rationalized by an active-site model, whose concentration increases by selectively washing out the distorted extraframework AlO6 species under (optimal) acidic conditions, liberating active Fe species.

Relevance: 30.00%

Abstract:

Hexaphenylbiadamantane-based microporous organic polymers (MOPs) were successfully synthesized by Suzuki coupling under mild conditions. The obtained MOPs show high surface area (891 m2 g−1), ultra-high thermal stability (less than 40% mass loss at temperatures up to 1000 °C), high chemical stability (no apparent decomposition in organic solvents for more than 7 days), gas (H2, CO2, CH4) capture capability, and vapor (benzene, hexane) adsorption. These combined abilities render the synthesized MOPs attractive candidates as thermo-chemically stable adsorbents for practical use in gas storage and pollutant vapor adsorption.

Relevance: 30.00%

Abstract:

Process systems design, operation and synthesis problems under uncertainty can readily be formulated as two-stage stochastic mixed-integer linear and nonlinear (nonconvex) programming (MILP and MINLP) problems. These problems, with a scenario-based formulation, lead to large-scale MILPs/MINLPs that are well structured. The first part of the thesis proposes a new finitely convergent cross decomposition method (CD), in which Benders decomposition (BD) and Dantzig-Wolfe decomposition (DWD) are combined in a unified framework to improve the solution of scenario-based two-stage stochastic MILPs. The method alternates between DWD iterations and BD iterations, where DWD restricted master problems and BD primal problems yield a sequence of upper bounds, and BD relaxed master problems yield a sequence of lower bounds. A variant of CD that adds multiple columns per iteration of the DWD restricted master problem and multiple cuts per iteration of the BD relaxed master problem, called multicolumn-multicut CD, is then developed to improve solution time. Finally, an extended cross decomposition method (ECD) for solving two-stage stochastic programs with risk constraints is proposed. In this approach, CD is applied at the first level and DWD at a second level to solve the original problem to optimality. ECD has a computational advantage over a bilevel decomposition strategy or solving the monolithic problem with an MILP solver. The second part of the thesis develops a joint decomposition approach combining Lagrangian decomposition (LD) and generalized Benders decomposition (GBD) to efficiently solve stochastic mixed-integer nonlinear nonconvex programming problems to global optimality, without the need for explicit branch-and-bound search. In this approach, LD subproblems and GBD subproblems are systematically solved in a single framework. The relaxed master problem, obtained from a reformulation of the original problem, is solved only when necessary. A convexification of the relaxed master problem and a domain reduction procedure are integrated into the decomposition framework to improve solution efficiency. Case studies taken from renewable-resource and fossil-fuel-based applications in process systems engineering show that these novel decomposition approaches offer significant benefits over classical decomposition methods and state-of-the-art MILP/MINLP global optimization solvers.
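The Benders half of the cross decomposition scheme can be sketched in a few lines: the relaxed master problem proposes a first-stage decision and yields a lower bound, the scenario subproblems evaluate it to yield an upper bound and a cut, and the loop stops when the bounds meet. The two-stage problem below (one binary capacity decision, two demand scenarios, master solved by enumeration instead of an MILP solver) is invented for illustration and is far smaller than anything in the thesis:

```python
def expected_recourse(x, scenarios, cap=5.0, penalty=3.0):
    # second-stage cost: penalised unmet demand, averaged over scenarios
    return sum(max(0.0, d - cap * x) * penalty for d in scenarios) / len(scenarios)

def expected_subgradient(x, scenarios, cap=5.0, penalty=3.0):
    # subgradient of the (convex, piecewise-linear) recourse cost w.r.t. x
    return sum(-penalty * cap if d - cap * x > 0 else 0.0 for d in scenarios) / len(scenarios)

def benders(scenarios, fixed_cost=10.0, tol=1e-9):
    cuts = []            # each cut reads: theta >= intercept + slope * x
    ub, best_x = float("inf"), None
    while True:
        # relaxed master: enumerate the binary first-stage decision
        lb, x_hat = min(
            (fixed_cost * x + max([0.0] + [a + b * x for a, b in cuts]), x)
            for x in (0, 1)
        )
        q = expected_recourse(x_hat, scenarios)
        if fixed_cost * x_hat + q < ub:
            ub, best_x = fixed_cost * x_hat + q, x_hat
        if ub - lb <= tol:
            return best_x, ub
        g = expected_subgradient(x_hat, scenarios)
        cuts.append((q - g * x_hat, g))   # optimality cut: theta >= q + g*(x - x_hat)

x_opt, obj = benders([2.0, 8.0])          # two equally likely demand scenarios
```

Cross decomposition interleaves such BD iterations with Dantzig-Wolfe iterations so that both bound sequences improve; this sketch shows only the BD side.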

Relevance: 30.00%

Abstract:

Security defects are common in large software systems because of their size and complexity. Even with efficient development processes, testing, and maintenance policies, a large number of vulnerabilities can still remain in a system. Some vulnerabilities persist from one release to the next because they cannot be easily reproduced through testing. These vulnerabilities endanger the security of the systems. We propose vulnerability classification and prediction frameworks based on vulnerability reproducibility. The frameworks are effective in identifying the types and locations of vulnerabilities at an early stage and in improving the security of subsequent versions (referred to as releases) of the software. We expand an existing concept of software bug classification to vulnerability classification (easily reproducible versus hard to reproduce), developing a classification framework that differentiates between these vulnerabilities based on code fixes and textual reports. We then investigate potential correlations between the vulnerability categories and classical software metrics, as well as other runtime environmental factors of reproducibility, to develop a vulnerability prediction framework. The classification and prediction frameworks help developers adopt corresponding mitigation or elimination actions and develop appropriate test cases. The prediction framework also helps security experts focus their efforts on the top-ranked vulnerability-prone files. As a result, the frameworks decrease the number of attacks that exploit security vulnerabilities in the next versions of the software. To build the classification and prediction frameworks, different machine learning techniques (C4.5 decision tree, random forest, logistic regression, and naive Bayes) are employed. The effectiveness of the proposed frameworks is assessed on collected software security defects of Mozilla Firefox.
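Of the four learners listed, logistic regression is the easiest to sketch without libraries. The gradient-descent trainer below classifies files into the two reproducibility categories from two features; the feature names (code churn, complexity) and all data are invented for illustration and are not the metrics or defects used in the thesis:

```python
import math

def train_logistic(rows, labels, lr=0.5, epochs=2000):
    # rows: feature vectors (e.g. code churn, complexity); labels: 1 = hard to reproduce
    w = [0.0] * (len(rows[0]) + 1)            # last entry is the bias term
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
            p = 1.0 / (1.0 + math.exp(-z))    # sigmoid prediction
            for i, xi in enumerate(x):        # gradient step on cross-entropy loss
                w[i] -= lr * (p - y) * xi
            w[-1] -= lr * (p - y)
    return w

def predict(w, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
    return 1 if z > 0 else 0

# hypothetical (churn, complexity) pairs, scaled to [0, 1]
X = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.3), (0.8, 0.9), (0.9, 0.7), (0.7, 0.8)]
y = [0, 0, 0, 1, 1, 1]
w = train_logistic(X, y)
```

In practice one would use a library implementation with proper validation; the point here is only the shape of the classification task.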

Relevance: 30.00%

Abstract:

This study aimed to investigate the effects of sex and deprivation on participation in a population-based faecal immunochemical test (FIT) colorectal cancer screening programme. The study population included 9785 individuals invited to participate in two rounds of a population-based biennial FIT-based screening programme, in a relatively deprived area of Dublin, Ireland. Explanatory variables included in the analysis were sex, deprivation category of area of residence and age (at end of screening). The primary outcome variable modelled was participation status in both rounds combined (with “participation” defined as having taken part in either or both rounds of screening). Poisson regression with a log link and robust error variance was used to estimate relative risks (RR) for participation. As a sensitivity analysis, data were stratified by screening round. In both the univariable and multivariable models deprivation was strongly associated with participation. Increasing affluence was associated with higher participation; participation was 26% higher in people resident in the most affluent compared to the most deprived areas (multivariable RR = 1.26: 95% CI 1.21–1.30). Participation was significantly lower in males (multivariable RR = 0.96: 95%CI 0.95–0.97) and generally increased with increasing age (trend per age group, multivariable RR = 1.02: 95%CI, 1.01–1.02). No significant interactions between the explanatory variables were found. The effects of deprivation and sex were similar by screening round. Deprivation and male gender are independently associated with lower uptake of population-based FIT colorectal cancer screening, even in a relatively deprived setting. Development of evidence-based interventions to increase uptake in these disadvantaged groups is urgently required.
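The effect sizes reported above are relative risks with confidence intervals; the standard Wald-type interval is computed on the log scale. A sketch with invented participation counts (not the study's data), shown only to make the RR and CI arithmetic concrete:

```python
import math

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of group A vs group B with a Wald 95% CI on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# hypothetical counts: 630/1000 participants in affluent areas vs 500/1000 in deprived areas
rr, lo, hi = relative_risk(630, 1000, 500, 1000)
```

The paper estimates its RRs from a Poisson regression with robust error variance, which adjusts for the other covariates; the two-group calculation above shows only the unadjusted case.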

Relevance: 30.00%

Abstract:

Logistic regression is a statistical tool widely used for predicting species' potential distributions from presence/absence data and a set of independent variables. However, logistic regression equations compute probability values based not only on the values of the predictor variables but also on the relative proportion of presences and absences in the dataset, which does not adequately describe the environmental favourability for or against species presence. A few strategies have been used to circumvent this, but they usually imply an alteration of the original data or the discarding of potentially valuable information. We propose a way to obtain from logistic regression an environmental favourability function whose results are not affected by an uneven proportion of presences and absences. We tested the method on the distribution of virtual species in an imaginary territory. The favourability models yielded similar values regardless of the variation in the presence/absence ratio. We also illustrate the method with the distribution of the Pyrenean desman (Galemys pyrenaicus) in Spain, for which the favourability model yielded more realistic potential distribution maps than the logistic regression model. Favourability values can be regarded as the degree of membership of the fuzzy set of sites whose environmental conditions are favourable to the species, which enables the rules of fuzzy logic to be applied to distribution modelling. They also allow direct comparisons between models for species with different presence/absence ratios in the study area. This makes them more useful for estimating the conservation value of areas, designing ecological corridors, or selecting appropriate areas for species reintroductions.
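A favourability transformation of this kind is usually written as a prevalence correction of the logistic output: dividing the predicted odds by the sample's presence/absence ratio. The exact formula below is an assumption based on the common form in the favourability-function literature, not quoted from the paper:

```python
def favourability(p, n_presences, n_absences):
    """Convert a logistic-regression probability into favourability.

    Dividing the predicted odds p/(1-p) by the sample odds n1/n0 removes
    the effect of prevalence: at p equal to the species' prevalence, F = 0.5,
    so models for common and rare species become directly comparable.
    """
    odds = p / (1.0 - p)
    return odds / (n_presences / n_absences + odds)
```

With 30 presences and 70 absences, a predicted probability equal to the prevalence (0.3) maps to a favourability of exactly 0.5, which is the neutrality property the abstract relies on for cross-species comparison.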

Relevance: 30.00%

Abstract:

This paper proposes a process for the classification of new residential electricity customers. The current state of the art is extended by using a combination of smart metering and survey data and by using model-based feature selection for the classification task. First, the normalized representative consumption profiles of the population are derived through clustering of household data. Second, new customers are classified using survey data and a limited amount of smart metering data. Third, regression analysis and model-based feature selection results explain the importance of the variables and the drivers of different consumption profiles, enabling the extraction of appropriate models. The results of a case study show that the use of survey data significantly increases the accuracy of the classification task (by up to 20%). Considering four consumption groups, more than half of the customers are correctly classified with only one week of metering data, and accuracy improves significantly with additional weeks. Model-based feature selection resulted in a significantly lower number of features, allowing an easy interpretation of the derived models.

Relevance: 30.00%

Abstract:

Noise is a constant presence in measurements. Its origin is related to the microscopic properties of matter. Since the seminal work of Brown in 1828, the study of stochastic processes has gained increasing interest with the development of new mathematical and analytical tools. In recent decades, the central role that noise plays in chemical and physiological processes has become recognized. The dual role of noise as nuisance and resource pushes towards the development of new decomposition techniques that divide a signal into its deterministic and stochastic components. In this thesis I show how methods based on Singular Spectrum Analysis (SSA) have the right properties to fulfil this requirement. During my work I applied SSA to different signals of interest in chemistry: I developed a novel iterative procedure for the denoising of powder X-ray diffractograms, and I denoised two-dimensional images from electrochemiluminescence (ECL) imaging experiments on micro-beads, obtaining new insight into the ECL mechanism. I also used Principal Component Analysis to investigate the relationship between brain electrophysiological signals and voice emission.
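The SSA decomposition rests on three steps: embed the series into a Hankel (trajectory) matrix, split that matrix into singular components, and map the retained components back to a series by diagonal averaging. A deliberately reduced sketch that keeps only the single leading component, extracted with power iteration to stay dependency-free (a real SSA denoiser keeps several components and chooses them by singular-value inspection):

```python
def ssa_denoise_rank1(series, window):
    """Rank-1 SSA: embed, extract the leading component, diagonally average."""
    n, K = len(series), len(series) - window + 1
    X = [[series[i + j] for j in range(K)] for i in range(window)]  # trajectory matrix
    u = [1.0] * window
    for _ in range(100):                        # power iteration on X X^T
        v = [sum(X[i][j] * u[i] for i in range(window)) for j in range(K)]
        w = [sum(X[i][j] * v[j] for j in range(K)) for i in range(window)]
        norm = sum(t * t for t in w) ** 0.5
        u = [t / norm for t in w]
    v = [sum(X[i][j] * u[i] for i in range(window)) for j in range(K)]
    # diagonal averaging maps the rank-1 matrix u v^T back to a series
    out = []
    for s in range(n):
        vals = [u[i] * v[s - i] for i in range(window) if 0 <= s - i < K]
        out.append(sum(vals) / len(vals))
    return out

# constant "signal" plus alternating "noise"; rank-1 SSA recovers the signal
noisy = [5.0 + (0.5 if i % 2 == 0 else -0.5) for i in range(41)]
clean = ssa_denoise_rank1(noisy, window=10)
```

The iterative procedure mentioned in the abstract builds on this basic decomposition; only the single-component core is shown here.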

Relevance: 30.00%

Abstract:

In this thesis, the viability of Dynamic Mode Decomposition (DMD) as a technique to analyze and model complex dynamic real-world systems is presented. This method derives, directly from data, computationally efficient reduced-order models (ROMs) that can replace high-fidelity physics-based models which are too onerous or unavailable. Optimizations and extensions to the standard implementation of the methodology are proposed and investigated through diverse case studies related to the decoding of complex flow phenomena. The flexibility of this data-driven technique allows its application to high-fidelity fluid dynamics simulations as well as to time series of observations of real systems. The resulting ROMs are tested on two tasks: (i) reducing the storage requirements of high-fidelity simulations or observations; and (ii) interpolating and extrapolating missing data. The capabilities of DMD can also be exploited to alleviate the cost of onerous studies that require many simulations, such as uncertainty quantification analysis, especially when dealing with complex high-dimensional systems. In this context, a novel approach to address parameter variability when modeling systems with space- and time-variant responses is proposed. Specifically, DMD is merged with another model-reduction technique, the Polynomial Chaos Expansion, for uncertainty quantification purposes. The study yields useful guidelines for DMD deployment, together with a demonstration of its potential to ease diagnosis and scenario analysis when complex flow processes are involved.
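At its core, DMD fits linear dynamics x_{k+1} ≈ A x_k to pairs of consecutive snapshots; the full algorithm projects A through an SVD of the snapshot matrix and its eigenvalues give growth/decay and oscillation rates. A deliberately one-dimensional sketch, where the single least-squares coefficient plays the role of the one DMD eigenvalue (the real method operates on high-dimensional flow snapshots):

```python
def scalar_dmd_eigenvalue(snapshots):
    """Least-squares fit of x_{k+1} = a * x_k over consecutive snapshot pairs.

    This is DMD collapsed to one dimension: `a` is the single "eigenvalue"
    (|a| < 1 means decay, |a| > 1 growth, negative a oscillation).
    """
    num = sum(x1 * x0 for x0, x1 in zip(snapshots, snapshots[1:]))
    den = sum(x0 * x0 for x0 in snapshots[:-1])
    return num / den

series = [2.0 * 0.9 ** k for k in range(20)]     # a decaying process
a = scalar_dmd_eigenvalue(series)                 # recovers the 0.9 decay rate
forecast = series[-1] * a ** 5                    # extrapolate 5 steps ahead
```

The extrapolation line is the essence of task (ii) above: once the dynamic modes are identified, missing or future snapshots are reconstructed by propagating the fitted dynamics.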

Relevance: 30.00%

Abstract:

The decomposition of Feynman integrals into a basis of independent master integrals is an essential ingredient of high-precision theoretical predictions, and it often represents a major bottleneck when processes with a high number of loops and legs are involved. In this thesis we present a new algorithm for the decomposition of Feynman integrals into master integrals within the formalism of intersection theory. Intersection theory is a novel approach that allows Feynman integrals to be decomposed into master integrals via projections, based on a scalar product between Feynman integrals called the intersection number. We propose a new, purely rational algorithm for the calculation of intersection numbers of differential $n$-forms that avoids algebraic extensions. We show how expansions around non-rational poles, a bottleneck of existing algorithms for intersection numbers, can be avoided by performing a series expansion around a rational polynomial irreducible over $\mathbb{Q}$, which we refer to as a $p(z)$-adic expansion. The algorithm has been implemented and tested on several diagrams, at both one and two loops.
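The projection described above can be stated compactly. The notation here follows common usage in the intersection-theory literature and is an assumption of this sketch, not quoted from the thesis: $J_i$ are the $\nu$ master integrals, $\{h_j\}$ a basis of dual forms, and $\langle \cdot | \cdot \rangle$ the intersection number.

```latex
% Decomposition of a Feynman integral I onto master integrals J_i
I = \sum_{i=1}^{\nu} c_i \, J_i ,
\qquad
c_i = \sum_{j=1}^{\nu} \langle I | h_j \rangle
      \left( \mathbf{C}^{-1} \right)_{ji} ,
\qquad
\mathbf{C}_{ij} = \langle h_i | h_j \rangle .
```

The coefficients $c_i$ are thus obtained by projection, exactly as vector components are extracted with a metric, which is what replaces the traditional system-solving of integration-by-parts reduction.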

Relevance: 30.00%

Abstract:

Tsunamis are rare events, but their impact can be devastating and may extend over large geographical areas. For low-probability, high-impact events like tsunamis, it is crucial to implement all possible actions to mitigate the risk. Tsunami hazard assessment is the result of a scientific process that integrates traditional geological methods, numerical modelling, and the analysis of tsunami sources and historical records. Analysing past events and understanding how they interacted with the land is therefore the only way to inform tsunami source and propagation models and to quantitatively test forecast models such as hazard analyses. The primary objective of this thesis is to establish an explicit relationship between the macroseismic intensity derived from historical descriptions and the quantitative physical parameters measuring tsunami waves. This is done first by defining an approximate estimation method, based on a simplified 1D physical onshore propagation model, to convert the available observations into one reference physical metric. Wave height at the coast was chosen as the reference because of its stability and its independence from inland effects. The method was then applied to a set of well-known past events to build a homogeneous dataset containing both macroseismic intensity and wave height. By performing an orthogonal regression, a direct and invertible empirical relationship could be established between the two parameters, accounting for their relevant uncertainties. The target relationship is extensively tested and finally applied to the Italian Tsunami Effect Database (ITED), providing a homogeneous estimate of wave height for all existing tsunami observations in Italy. This enables meaningful comparison with models and simulations, as well as quantitative testing of tsunami hazard models for the Italian coasts, informing tsunami risk management initiatives.
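Orthogonal regression is used here because both variables (intensity and wave height) carry error, so minimising perpendicular rather than vertical distances gives the direct and invertible relationship the thesis needs. The closed form below assumes the Deming special case with equal error variances in both variables, which the abstract does not specify; the data are invented:

```python
import math

def orthogonal_regression(xs, ys):
    """Orthogonal (Deming, equal-variance) fit minimising perpendicular distances.

    Unlike ordinary least squares, the fit is symmetric in x and y, so the
    fitted line can be inverted to predict x from y without bias.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx

# illustrative points lying exactly on y = 2x + 1
slope, intercept = orthogonal_regression([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

In the thesis the fit would additionally propagate the stated uncertainties of both parameters; this sketch shows only the geometric core of the estimator.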