818 results for fuzzy rule base models
Abstract:
Evidence-based practice as it applies to the Library and Information Science (LIS) sector, and to teacher librarians in particular, is the focus of this research investigation. The context for this research is Australian school libraries and teacher librarians. This is research in progress, and the report here includes some very early findings and lessons learned from the initial pilot study. The contribution of this research will be a framework for the library and information sector, with particular application to teacher librarians. Providing meaningful evidence of work practices that demonstrate contributions to a school's goals and mission statement, in conjunction with contributions to students' academic, social and cultural achievements, is crucial for the future of the teacher librarian.
Abstract:
There is a severe tendency in cyberlaw theory to delegitimize state intervention in the governance of virtual communities. Much of the existing theory makes one of two fundamentally flawed assumptions: that communities will always be best governed without the intervention of the state; or that the territorial state can best encourage the development of communities by creating enforceable property rights and allowing the market to resolve any disputes. These assumptions do not ascribe sufficient weight to the value-laden support that the territorial state always provides to private governance regimes, the inefficiencies that will tend to limit the development of utopian communities, and the continued role of the territorial state in limiting autonomy in accordance with communal values. In order to overcome these deterministic assumptions, this article provides a framework based upon the values of the rule of law through which to conceptualise the legitimacy of the private exercise of power in virtual communities. The rule of law provides a constitutional discourse that assists in considering appropriate limits on the exercise of private power. I argue that the private contractual framework that is used to govern relations in virtual communities ought to be informed by the values of the rule of law in order to more appropriately address the governance tensions that permeate these spaces. These values suggest three main limits to the exercise of private power: that governance is limited by community rules, and that the scope of autonomy is limited by the substantive values of the territorial state; that private contractual rules should be general, equal, and certain; and that, most importantly, internal norms be predicated upon the consent of participants.
Abstract:
Power system stabilizers (PSSs) work well at the particular network configuration and steady-state conditions for which they were designed; once conditions change, their performance degrades. This can be overcome by an intelligent nonlinear PSS based on fuzzy logic. Such a fuzzy logic power system stabilizer (FLPSS) is developed, using speed and power deviation as inputs, and provides an auxiliary signal for the excitation system of a synchronous machine in a multimachine power system environment. The FLPSS's effect on system damping is then compared with that of a conventional power system stabilizer (CPSS). The results demonstrate improved system performance with the FLPSS and show that the FLPSS is robust.
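To make the control idea concrete, here is a minimal, self-contained sketch of a Mamdani-style fuzzy inference step with the two inputs named in the abstract (speed deviation and power deviation) and one stabilizing output. The membership functions, rule table, and scaling are illustrative assumptions, not the paper's tuned design.

```python
# Illustrative fuzzy-logic stabilizer step: two inputs, one auxiliary output.
# All membership functions and rules are invented for demonstration.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzify(x):
    """Membership grades for labels negative / zero / positive on [-1, 1]."""
    return {"N": tri(x, -1.5, -1.0, 0.0),
            "Z": tri(x, -1.0, 0.0, 1.0),
            "P": tri(x, 0.0, 1.0, 1.5)}

# Rule table: output level as a function of (speed label, power label).
RULES = {("N", "N"): -1.0, ("N", "Z"): -0.5, ("N", "P"): 0.0,
         ("Z", "N"): -0.5, ("Z", "Z"):  0.0, ("Z", "P"): 0.5,
         ("P", "N"):  0.0, ("P", "Z"):  0.5, ("P", "P"): 1.0}

def flpss_signal(speed_dev, power_dev):
    """Weighted-average defuzzification over the fired rules."""
    mu_s, mu_p = fuzzify(speed_dev), fuzzify(power_dev)
    num = den = 0.0
    for (ls, lp), out in RULES.items():
        w = min(mu_s[ls], mu_p[lp])   # rule firing strength (min t-norm)
        num += w * out
        den += w
    return num / den if den else 0.0

print(flpss_signal(0.3, -0.1))  # auxiliary excitation signal at this operating point
```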
Abstract:
This paper reports on a study of passenger experiences and of how passengers interact with services, technology and processes at an airport. As part of our research, we followed people through the airport from check-in to security and from security to boarding. Data were collected by approaching passengers in the departures concourse of the airport and asking for their consent to be videotaped. The data were then coded, and the analysis focused on both discretionary and process-related passenger activities. Our findings show the interdependence between activities and passenger experiences. Within all activities, passengers interact with processes, domain-dependent technology, services, personnel and artifacts. These levels of interaction affect passenger experiences and are interdependent. The emerging taxonomy of activities consists of (i) ownership-related activities, (ii) group activities, (iii) individual activities (such as activities at the domain interfaces) and (iv) concurrent activities. This classification contributes to the development of descriptive models of passenger experiences and of how these activities affect the facilitation and design of future airports.
Abstract:
In an open railway access market, the Infrastructure Provider (IP), upon receipt of service bids from the Train Service Providers (TSPs), assigns track access rights according to its own business objectives and the merits of the bids, and produces the train service timetable through negotiations. In practice, the IP chooses to negotiate with the TSPs one by one, in a sequence chosen so that the IP optimizes its objectives. The TSP bids are usually very complicated, containing a large number of parameters of different natures. It is a difficult task, even for an expert, to derive a priority sequence for negotiations from the contents of the bids. This study proposes the application of a fuzzy ranking method to compare and prioritize the TSP bids in order to produce a negotiation sequence. The results of this study allow investigation of the behaviors of the stakeholders in bid preparation and negotiation, as well as evaluation of service quality in the open railway market.
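As a hedged illustration of the general idea, the sketch below ranks bids represented as triangular fuzzy numbers by their centroids to produce a negotiation sequence. The bid names, scores, and the specific ranking index are invented and are far simpler than the paper's actual method.

```python
# Rank bids scored as triangular fuzzy numbers (low, mode, high).
# Values are invented for illustration only.
bids = {
    "TSP-A": (0.55, 0.70, 0.80),   # (pessimistic, most likely, optimistic) score
    "TSP-B": (0.40, 0.75, 0.85),
    "TSP-C": (0.60, 0.65, 0.90),
}

def centroid(tfn):
    """Centroid of a triangular fuzzy number: (l + m + u) / 3."""
    l, m, u = tfn
    return (l + m + u) / 3.0

# Higher centroid = earlier position in the negotiation sequence.
sequence = sorted(bids, key=lambda name: centroid(bids[name]), reverse=True)
print(sequence)  # ['TSP-C', 'TSP-A', 'TSP-B'] for these numbers
```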
Abstract:
Advances in safety research—improving the collective understanding of motor vehicle crash causation—rest upon the pursuit of numerous lines of inquiry. The research community has focused on analytical methods development (negative binomial specifications, simultaneous equations, etc.), on better experimental designs (before-after studies, comparison sites, etc.), on improving exposure measures, and on model specification improvements (additive terms, non-linear relations, etc.). One might think of different lines of inquiry in terms of 'low-lying fruit'—areas of inquiry that might provide significant improvements in understanding crash causation. It is the contention of this research that omitted variable bias caused by the exclusion of important variables is an important line of inquiry in safety research. In particular, spatially related variables are often difficult to collect and are omitted from crash models—yet they offer a significant ability to better understand the factors contributing to crashes. This study—believed to represent a unique contribution to the safety literature—develops and examines the role of a sizeable set of spatial variables in intersection crash occurrence. In addition to commonly considered traffic and geometric variables, the examined spatial factors include local influences of weather, sun glare, proximity to drinking establishments, and proximity to schools. The results indicate that inclusion of these factors significantly improves model explanatory power, and the results also generally agree with expectations. The research illuminates the importance of spatial variables in safety research, as well as the negative consequences of their omission.
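The following toy simulation (assuming statsmodels is available; all coefficients invented) illustrates the omitted-variable mechanism the abstract describes: a spatial covariate such as sun glare that is correlated with traffic volume distorts the volume coefficient when left out of a Poisson crash model.

```python
# Omitted-variable bias in a Poisson crash model: toy demonstration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
volume = rng.normal(size=n)
glare = 0.6 * volume + rng.normal(size=n)        # spatial factor, correlated with volume
mu = np.exp(0.2 + 0.4 * volume + 0.3 * glare)    # true crash rate uses both covariates
crashes = rng.poisson(mu)

full = sm.GLM(crashes, sm.add_constant(np.column_stack([volume, glare])),
              family=sm.families.Poisson()).fit()
omitted = sm.GLM(crashes, sm.add_constant(volume),
                 family=sm.families.Poisson()).fit()
# The volume coefficient absorbs glare's effect when glare is omitted.
print(full.params[1], omitted.params[1])
```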
Abstract:
Crash prediction models are used for a variety of purposes, including forecasting the expected future performance of transportation system segments with similar traits. The influence of intersection features on safety has been examined extensively because intersections experience a relatively large proportion of motor vehicle conflicts and crashes compared to other segments of the transportation system. The effects of left-turn lanes at intersections in particular have produced mixed results in the literature. Some researchers have found that left-turn lanes are beneficial to safety, while others have reported detrimental effects. This inconsistency is not surprising given that the installation of left-turn lanes is often endogenous, that is, influenced by crash counts and/or traffic volumes. Endogeneity creates problems in econometric and statistical models and is likely to account for the inconsistencies reported in the literature. This paper reports on a limited-information maximum likelihood (LIML) estimation approach to compensate for endogeneity between left-turn lane presence and angle crashes. The effects of endogeneity are mitigated using the approach, revealing the unbiased effect of left-turn lanes on crash frequency for a dataset of Georgia intersections. The research shows that without accounting for endogeneity, left-turn lanes 'appear' to contribute to crashes; however, when endogeneity is accounted for in the model, left-turn lanes reduce angle crash frequencies, as expected by engineering judgment. Other endogenous variables may lurk in crash models as well, suggesting that the method may be used to correct simultaneity problems with other variables and in other transportation modeling contexts.
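To illustrate the endogeneity correction in spirit, the sketch below uses plain-numpy two-stage least squares, a close relative of the LIML estimator the paper actually applies; the variables, instrument, and coefficients are all invented for illustration.

```python
# Instrumenting an endogenous regressor: 2SLS stand-in for LIML.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
z = rng.normal(size=n)                  # instrument: shifts lane installation only
u = rng.normal(size=n)                  # unobservable driving both lanes and crashes
lane = (0.8 * z + u + rng.normal(size=n) > 0).astype(float)   # endogenous regressor
crashes = 1.0 - 0.5 * lane + u + rng.normal(size=n)           # true lane effect: -0.5

X = np.column_stack([np.ones(n), lane])
Z = np.column_stack([np.ones(n), z])

# Naive OLS is biased upward because `lane` is correlated with `u`.
ols = np.linalg.lstsq(X, crashes, rcond=None)[0]

# 2SLS: project the endogenous column onto the instrument, then regress.
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
iv = np.linalg.lstsq(X_hat, crashes, rcond=None)[0]
print(ols[1], iv[1])   # OLS makes lanes 'appear' harmful; IV recovers roughly -0.5
```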
Abstract:
Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts—variation over and above that accounted for by the Poisson density. The extra-variation – or dispersion – is theorized to capture unaccounted-for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models—tantamount to assuming that unaccounted-for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed dispersion parameter assumption and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord, exploring additional dispersion functions, using an independent data set, and presenting an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four of them employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics. The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e. the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that the known influences on expected crash counts are likely to be different from the factors that might help to explain unaccounted-for variation in crashes across sites.
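As a small worked illustration of where the dispersion parameter comes from, the sketch below simulates the Poisson-gamma mixture underlying the negative binomial and confirms the variance function Var(y) = mu + alpha*mu^2; the numbers are illustrative and unrelated to the Georgia data.

```python
# Negative binomial counts as a Poisson-gamma mixture: the source of
# extra-Poisson variation ("dispersion") in over-dispersed crash models.
import numpy as np

rng = np.random.default_rng(2)
mu, alpha, n = 4.0, 0.5, 200_000

# Gamma site effects with mean 1 and variance alpha ...
site_effect = rng.gamma(shape=1 / alpha, scale=alpha, size=n)
# ... mixed through a Poisson give negative binomial counts.
y = rng.poisson(mu * site_effect)

print(y.mean())                      # ~ mu = 4.0
print(y.var(), mu + alpha * mu**2)   # both ~ 12.0: Var = mu + alpha*mu^2, not mu
```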
Abstract:
There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions accompanying each, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states—perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of "excess" zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows Bernoulli trials with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate it. We also present the theory behind dual-state count models and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how crash data give rise to the "excess" zeros frequently observed. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed—and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (for observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
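A minimal version of the paper's simulation argument, with invented exposure levels and crash probabilities: Bernoulli trials with small, unequal probabilities observed over a short period yield a large share of zeros without any 'perfectly safe' latent state.

```python
# Poisson trials with low exposure produce apparent "excess" zeros.
import numpy as np

rng = np.random.default_rng(3)
sites, period = 1000, 50                       # 1000 sites, 50 vehicle passages each
p = rng.uniform(0.0001, 0.004, size=sites)     # unequal per-passage crash probability

# Crash count per site = sum of Bernoulli trials over its exposure.
counts = rng.binomial(period, p)

zeros = (counts == 0).mean()
poisson_zeros = np.exp(-counts.mean())         # zero share a Poisson fit would predict
# The observed zero share is high (low exposure) and slightly exceeds the
# Poisson prediction (heterogeneity) -- no dual-state process is involved.
print(zeros, poisson_zeros)
```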
Abstract:
It is important to examine the nature of the relationships between roadway, environmental, and traffic factors and motor vehicle crashes, with the aim of improving the collective understanding of the causal mechanisms involved in crashes and better predicting their occurrence. Statistical models of motor vehicle crashes are one path of inquiry often used to gain these initial insights. Recent efforts have focused on the estimation of negative binomial and Poisson regression models (and related variants) due to their relatively good fit to crash data. Of course, analysts constantly seek methods that offer greater consistency with the data-generating mechanism (motor vehicle crashes in this case), provide better statistical fit, and provide insight into data structure that was previously unavailable. One such opportunity exists with some types of crash data, in particular crash-level data collected across roadway segments, intersections, etc. It is argued in this paper that some crash data possess a hierarchical structure that has not routinely been exploited. This paper describes the application of binomial multilevel models of crash types using 548 motor vehicle crashes collected from 91 two-lane rural intersections in the state of Georgia. Crash prediction models are estimated for angle, rear-end, and sideswipe (both same-direction and opposite-direction) crashes. The contributions of the paper are the recognition of hierarchical data structure and the application of a theoretically appealing and suitable analysis approach for multilevel data, yielding insights into intersection-related crashes by crash type.
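To make the hierarchical structure concrete, here is an illustrative simulation (all numbers invented apart from the 91 intersections) of crash-type outcomes nested within intersections, with an intersection-level random intercept of the kind a binomial multilevel model estimates.

```python
# Two-level crash data: crashes (level 1) nested within intersections (level 2).
import numpy as np

rng = np.random.default_rng(4)
n_intersections = 91
crashes_per_site = rng.integers(2, 12, size=n_intersections)

# Level-2 random effect: each intersection has its own baseline log-odds.
u = rng.normal(0.0, 0.8, size=n_intersections)

rows = []
for j, (n_j, u_j) in enumerate(zip(crashes_per_site, u)):
    skew = rng.normal(size=n_j)                 # level-1 covariate, e.g. approach skew
    logit = -0.5 + 0.6 * skew + u_j             # fixed effects plus random intercept
    angle = rng.random(n_j) < 1 / (1 + np.exp(-logit))
    rows += [(j, s, int(a)) for s, a in zip(skew, angle)]

# `rows` holds (intersection_id, covariate, is_angle_crash); a binomial
# multilevel model would estimate the fixed effects plus Var(u).
print(len(rows), "crash records across", n_intersections, "intersections")
```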
Abstract:
A study was done to develop macrolevel crash prediction models that can be used to understand and identify effective countermeasures for improving signalized highway intersections and multilane stop-controlled highway intersections in rural areas. Poisson and negative binomial regression models were fit to intersection crash data from Georgia, California, and Michigan. To assess the suitability of the models, several goodness-of-fit measures were computed. The statistical models were then used to shed light on the relationships between crash occurrence and traffic and geometric features of the rural signalized intersections. The results revealed that traffic flow variables significantly affected the overall safety performance of the intersections regardless of intersection type and that the geometric features of intersections varied across intersection type and also influenced crash type.
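As a hedged sketch of the model-fitting step, the code below fits Poisson and negative binomial GLMs to synthetic intersection counts with statsmodels and compares goodness of fit; the data-generating values and the fixed NB dispersion (alpha=0.5) are assumptions for illustration, not values from the study.

```python
# Fit Poisson and negative binomial models to synthetic intersection counts.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 500
major_aadt = rng.uniform(2_000, 30_000, size=n)
minor_aadt = rng.uniform(200, 8_000, size=n)
X = sm.add_constant(np.column_stack([np.log(major_aadt), np.log(minor_aadt)]))
mu = np.exp(X @ np.array([-8.0, 0.6, 0.4]))
crashes = rng.poisson(mu * rng.gamma(2.0, 0.5, size=n))   # over-dispersed counts

poisson = sm.GLM(crashes, X, family=sm.families.Poisson()).fit()
negbin = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(poisson.aic, negbin.aic)   # goodness-of-fit comparison across model families
```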
Abstract:
The intent of this note is to succinctly articulate additional points that were not provided in the original paper (Lord et al., 2005) and to help clarify a collective reluctance to adopt zero-inflated (ZI) models for modeling highway safety data. A dialogue on this issue—just one of many important safety modeling issues—is healthy discourse on the path towards improved safety modeling. This note first provides a summary of the prior findings and conclusions of the original paper. It then presents two critical and relevant issues: the fallacy of maximizing statistical fit, and logic problems with the ZI model in highway safety modeling. Finally, we provide brief conclusions.
Abstract:
Purpose: To investigate the influence of convergence on axial length and corneal topography in young adult subjects. Methods: Fifteen emmetropic young adult subjects with normal binocular vision had axial length and corneal topography measured immediately before and after a 15-min period of base-out (BO) prismatic spectacle lens wear. Prismatic spectacles of two different magnitudes were worn in turn (8 Δ BO and 16 Δ BO), and for both tasks distance fixation was maintained for the duration of lens wear. Eight subjects returned on a separate day for further testing and had axial length measured before, during, and immediately after a 15-min convergence task. Results: No significant change was found to occur in axial length either during or after the sustained convergence tasks (p > 0.6). Some small but significant changes in corneal topography were found to occur after sustained convergence. The most significant corneal change was observed after the 16 Δ BO prism wear. The corneal refractive power spherocylinder power vector J0 was found to change by a small (mean change of 0.03 D after the 16 Δ BO task) but statistically significant (p = 0.03) amount as a result of the convergence task (indicative of a reduction in with-the-rule corneal astigmatism after convergence). Corneal axial power was found to exhibit significant flattening in superior regions. Conclusions: Axial length appears largely unchanged by a period of sustained convergence. However, small but significant changes occur in the topography of the cornea after convergence.
Abstract:
The dominant economic paradigm currently guiding industry policy making in Australia, and in much of the rest of the world, is the neoclassical approach. Although neoclassical theories acknowledge that growth is driven by innovation, such innovation is exogenous to their standard models and hence often not explored. Instead the focus is on the allocation of scarce resources, with innovation perceived as an external shock to the system. Indeed, the analysis of innovation is largely undertaken by other disciplines, such as evolutionary economics and institutional economics. As more has become known about innovation processes, linear models, based on research and development or market demand, have been replaced by more complex interactive models which emphasise the existence of feedback loops between the actors and activities involved in the commercialisation of ideas (Manley 2003). Currently dominant among these approaches is the national or sectoral innovation system model (Breschi and Malerba 2000; Nelson 1993), which is based on the notion of increasingly open innovation systems (Chesbrough, Vanhaverbeke, and West 2008). This chapter reports on the ‘BRITE Survey’, funded by the Cooperative Research Centre for Construction Innovation, which investigated the open sectoral innovation system operating in the Australian construction industry. The BRITE Survey was undertaken in 2004 and is the largest construction innovation survey ever conducted in Australia. The results reported here give an indication of how construction innovation processes operate, offering an example that should be of interest to international audiences interested in construction economics. The questionnaire was based on a broad range of indicators recommended in the OECD’s Community Innovation Survey guidelines (OECD/Eurostat 2005). Although the Australian Bureau of Statistics (ABS) has recently begun to undertake regular innovation surveys that include the construction industry (ABS 2006), these employ a very narrow definition of the industry and collect only very basic data compared to the BRITE Survey data presented in this chapter. The term ‘innovation’ is defined here as a new or significantly improved technology or organisational practice, based broadly on OECD definitions (OECD/Eurostat 2005). Innovation may be technological or organisational in nature, and it may be new to the world, or just new to the industry or the business concerned. The definition thus includes the simple adoption of existing technological and organisational advancements. The survey collected information about respondents’ perceptions of innovation determinants in the industry, comprising various aspects of business strategy and the business environment. It builds on a pilot innovation survey undertaken by PricewaterhouseCoopers (PWC) for the Australian Construction Industry Forum, on behalf of the Australian Commonwealth Department of Industry, Tourism and Resources, in 2001 (PWC 2002). The survey responds to an identified need within the Australian construction industry for accurate and timely innovation data upon which to base effective management strategies and public policies (Focus Group 2004).
Abstract:
We advance the proposition that dynamic stochastic general equilibrium (DSGE) models should not be estimated and evaluated only with full information methods, which require that the complete system of equations be specified properly. Some limited information analysis, which focuses upon specific equations, is therefore likely to be a useful complement to full system analysis. Two major problems occur when implementing limited information methods: the presence of forward-looking expectations in the system, and unobservable non-stationary variables. We present methods for dealing with both of these difficulties, and illustrate the interaction between full and limited information methods using a well-known model.