959 results for "Classical tradicion"
Abstract:
A quantitative, quasi-experimental study of the effectiveness of computer-based scientific visualizations for concept learning by Year 11 physics students (n=80) was conducted in six Queensland high school classrooms. Students' gender and academic ability were also considered as factors in the effectiveness of teaching with visualizations. Learning with visualizations was found to be as effective as learning without them, with no statistically significant difference in outcomes observed for the group as a whole or across academic ability levels. Male students, however, were found to learn significantly better with visualizations than without, while no such effect was observed for female students; this raises some equity concerns about introducing visualizations. Given that other research shows that students enjoy learning with visualizations and that their engagement with learning is enhanced, the finding that learning outcomes are no worse than those of teaching without visualizations supports teachers' use of visualizations.
Abstract:
This article presents a methodology that integrates cumulative plots with probe-vehicle data to estimate travel time statistics (average, quartiles) on urban networks. The integration reduces the relative deviation between the cumulative plots so that the classical analytical procedure, defining the area between the plots as the total travel time, can be applied. For quartile estimation, a slicing technique is proposed. The methodology is validated with real data from Lucerne, Switzerland, and the travel time estimates it produces are found to be statistically equivalent to the observed values.
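The classical identity this abstract relies on is easy to state: with cumulative arrivals A(t) at the upstream end of a link and cumulative departures D(t) downstream, the area between the two plots is the total travel time, so dividing by the vehicle count gives the average. A minimal sketch of that step (illustrative code assuming FIFO traffic and synchronised detectors; not the authors' implementation):

```python
import numpy as np

def average_travel_time(t, arrivals_cum, departures_cum):
    """Classical cumulative-plot estimate: the area between the upstream
    arrival curve A(t) and the downstream departure curve D(t) equals the
    total travel time of all vehicles (assuming FIFO), so dividing by the
    number of departed vehicles gives the average travel time."""
    total_time = np.trapz(arrivals_cum - departures_cum, t)  # vehicle-seconds
    n_vehicles = departures_cum[-1] - departures_cum[0]
    return total_time / n_vehicles

# Synthetic check: identical demand curves offset by 45 s.
t = np.arange(0.0, 600.0, 1.0)
A = 0.3 * t
D = np.clip(0.3 * (t - 45.0), 0.0, None)
print(average_travel_time(t, A, D))  # close to the imposed 45 s lag
```

The relative-deviation reduction described in the abstract is what makes this area well defined in practice; the probe-vehicle data pin the plots to observed passage times before the area is computed.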
Abstract:
Divergence from a random baseline is a technique for evaluating document clustering. It ensures that cluster quality measures are doing useful work, by preventing ineffective clusterings that provide no useful result from receiving high scores. These concepts are defined and analysed using both intrinsic and extrinsic approaches to evaluating document cluster quality, including the classical clusters-to-categories approach and a novel approach based on ad hoc information retrieval. The divergence-from-a-random-baseline approach is able to differentiate the ineffective clusterings encountered in the INEX XML Mining track. It also appears to perform a normalisation similar to the Normalised Mutual Information (NMI) measure, yet it can be applied to any measure of cluster quality. When applied to the intrinsic measure of distortion, as measured by RMSE, subtraction of a random baseline reveals a clear optimum that is not otherwise apparent. The approach can be applied to any clustering evaluation; this paper describes its use in the context of document clustering evaluation.
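The core adjustment is simple to sketch. A hedged illustration of the idea (my own minimal code, with purity standing in for an arbitrary quality measure; not the INEX evaluation pipeline):

```python
import numpy as np

def purity(labels, gold):
    """Fraction of documents falling in their cluster's majority gold
    category (labels and gold are equal-length integer arrays)."""
    labels, gold = np.asarray(labels), np.asarray(gold)
    return sum(np.bincount(gold[labels == c]).max()
               for c in np.unique(labels)) / len(gold)

def baseline_adjusted(labels, gold, measure=purity, n_draws=200, seed=0):
    """Divergence-from-a-random-baseline idea: subtract the mean score of
    random clusterings that share the clustering's cluster-size
    distribution, so a measure that rewards degenerate solutions is
    pulled back towards zero."""
    rng = np.random.default_rng(seed)
    raw = measure(labels, gold)
    baseline = np.mean([measure(rng.permutation(labels), gold)
                        for _ in range(n_draws)])
    return raw - baseline
```

Permuting the labels keeps every cluster's size while destroying any association with the gold categories, which is exactly the property a size-matched random baseline needs.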
Abstract:
1. The phylogeography of freshwater taxa is often integrally linked with landscape changes, such as drainage re-alignments, that may present the only avenue for historical dispersal of these taxa. Classical models of gene flow do not account for landscape changes and so are of little use in predicting phylogeography in geologically young freshwater landscapes. When the history of drainage formation is unknown, phylogeographical predictions can be based on current freshwater landscape structure, proposed historical drainage geomorphology, or the phylogeographical patterns of co-distributed taxa.
2. This study describes the population structure of a sedentary freshwater fish, the chevron snakehead (Channa striata), across two river drainages on the Indochinese Peninsula. The phylogeographical pattern recovered for C. striata was tested against seven hypotheses based on contemporary landscape structure, proposed history and the phylogeographical patterns of co-distributed taxa.
3. Consistent with the species' ecology, analysis of mitochondrial and microsatellite loci revealed very high differentiation among all sampled sites. A strong signature of historical population subdivision was also revealed within the contemporary Mekong River Basin (MRB). Of the seven phylogeographical hypotheses tested, the patterns of co-distributed taxa proved the most adequate for describing the phylogeography of C. striata.
4. The results shed new light on SE Asian drainage evolution, indicating that the Middle MRB probably evolved via amalgamation of at least three historically independent drainage sections, and in particular that the Mekong River section centred on the northern Khorat Plateau in NE Thailand was probably isolated from the greater Mekong for an extensive period of evolutionary time. In contrast, C. striata populations in the Lower MRB do not show a phylogeographical signature of evolution in historically isolated drainage lines, suggesting that drainage amalgamation has been less important for river landscape formation in this region.
Abstract:
The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib and Pelletier's (2011) theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing adequate coverage of the relevant empirical results. In the final part, we argue that a substantial modification of the analysis put forward by Alxatib and Pelletier, and of its probabilistic pendant, is needed. The proposed modification replaces the standard notion of probability with quantum probability. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon.
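The interference mechanism invoked in the final part can be given a generic sketch (this is the standard quantum-probability template, not necessarily the authors' exact model): represent the predicate and its negation by projectors on a cognitive state, with the negation drawn from an incompatible basis rather than taken as the orthocomplement.

```latex
% Generic quantum-probability sketch (illustrative template only).
% P_A and P_{\neg A} are projectors for "tall" / "not tall" acting on
% the cognitive state \psi evoked by a borderline individual.
\[
  q(A) = \lVert P_A \psi \rVert^{2}, \qquad
  q(A \wedge \neg A) = \lVert P_{\neg A}\, P_A\, \psi \rVert^{2}.
\]
% If P_{\neg A} were the orthocomplement I - P_A, the sequential
% conjunction probability would vanish. With non-commuting projectors
% it need not: expanding \psi in the two incompatible bases produces
% interference cross-terms, so q(A and not-A) can be substantial for
% borderline cases -- the borderline-contradiction effect.
```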
Abstract:
The identification of the primary drivers of stock returns has been of great interest to financial practitioners and academics alike for many decades. Influenced by classical financial theories such as the CAPM (Sharpe, 1964; Lintner, 1965) and the APT (Ross, 1976), a linear relationship is conventionally assumed between company characteristics, as derived from their financial accounts, and forward returns. While this assumption may be a fair approximation to the underlying structural relationship, it is often adopted for convenience. It is actually quite rare for the assumptions of distributional normality and a linear relationship to be explicitly assessed in advance, even though this information would help inform the appropriate choice of modelling technique. Non-linear models have nevertheless been applied successfully to the task of stock selection in the past (Sorensen et al., 2000). However, their take-up by the investment community has been limited, despite researchers in other fields finding them a useful way to express knowledge and aid decision-making...
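As the abstract notes, the normality and linearity assumptions are rarely checked before a modelling technique is chosen. A minimal sketch of such a pre-check (illustrative function and variable names, not the paper's procedure):

```python
import numpy as np
from scipy import stats

def prescreen(factor, fwd_returns):
    """Quick checks of the two conventional assumptions before fitting:
    (1) distributional normality of the factor exposures, and
    (2) linearity of the factor/forward-return relationship."""
    # (1) Jarque-Bera test: a small p-value suggests non-normal exposures.
    jb_stat, jb_p = stats.jarque_bera(factor)
    # (2) Compare a linear association against a rank (monotone, possibly
    #     non-linear) association; a large gap hints at non-linearity.
    pearson_r, _ = stats.pearsonr(factor, fwd_returns)
    spearman_r, _ = stats.spearmanr(factor, fwd_returns)
    return {"jarque_bera_p": jb_p,
            "pearson": pearson_r,
            "spearman": spearman_r}
```

A large gap between the Pearson and Spearman coefficients is a quick hint that the relationship is monotone but not linear, which would favour a non-linear model.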
Abstract:
A new dual-scale modelling approach is presented for simulating the drying of a wet hygroscopic porous material, coupling the porous medium (macroscale) with the underlying pore structure (microscale). The proposed model is applied to the convective drying of wood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between the scales is achieved by imposing the macroscopic gradients of moisture content and temperature on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic mass and thermal fluxes to be defined as averages of the microscopic fluxes over the unit cell. This novel formulation accounts for the intricate coupling of heat and mass transfer at the microscopic scale, but reduces to a classical homogenisation approach if a linear relationship is assumed between the microscopic gradient and flux. Simulation results for a sample of spruce wood highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit-cell configuration it is not necessary to postulate the form of the macroscopic fluxes prior to the simulations, because these emerge directly from the dual-scale formulation.
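The scale coupling described here can be written compactly; the following is a generic statement of the averaging step, with notation of my own choosing rather than the paper's:

```latex
% Generic form of the averaging step (notation mine, not the paper's):
% macroscopic mass and heat fluxes are unit-cell averages of the
% microscopic fluxes computed under the imposed macroscopic gradients
% of moisture content \bar{X} and temperature \bar{T}.
\[
  \bar{\mathbf{J}} \;=\; \frac{1}{\lvert \Omega \rvert}
      \int_{\Omega} \mathbf{j}\!\left(\nabla \bar{X}, \nabla \bar{T}\right) dV,
  \qquad
  \bar{\mathbf{q}} \;=\; \frac{1}{\lvert \Omega \rvert}
      \int_{\Omega} \mathbf{q}\!\left(\nabla \bar{X}, \nabla \bar{T}\right) dV,
\]
% where \Omega is the periodic unit cell. If the microscopic fluxes
% depend linearly on the imposed gradients, these averages collapse to
% constant effective transport coefficients, recovering the classical
% homogenisation approach mentioned in the abstract.
```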
Abstract:
Dual-mode vibration of nanowires has been reported experimentally through actuation of the nanowire at its resonance frequency, which is expected to open up a variety of new modalities for NEMS that could operate in the nonlinear regime. In the present work, we utilize large-scale molecular dynamics (MD) simulations to investigate the dual-mode vibration of <110> Ag nanowires (NWs) with triangular, rhombic and truncated-rhombic cross-sections. By incorporating the generalized Young-Laplace equation into Euler-Bernoulli beam theory, the influence of surface effects on the dual-mode vibration is studied. Because the lattice spacing differs along the two principal axes of inertia of the {110} atomic layers, the NW is also modeled as a discrete system to reveal the influence of this specific atomic arrangement. It is found that a <110> Ag NW will undergo dual-mode vibration if the actuation direction deviates from the two principal axes of inertia. The classical beam model underestimates the two first-mode natural frequencies relative to the MD results; the discrete model improves these predictions. In particular, the predictions of beam theory with surface effects included are uniformly larger than those of the classical beam model and agree better with the MD results for larger cross-sectional sizes. For ultrathin NWs, however, the current treatment of surface effects still shows some inaccuracy. For all cross-sections considered, the inclusion of surface effects reduces the difference between the two first-mode natural frequencies, a trend consistent with the MD results. This study provides a first comprehensive investigation of the dual-mode vibration of <110>-oriented Ag NWs and should benefit applications in which NWs act as resonating beams.
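For reference, the frequency split at the heart of the dual-mode behaviour follows from bending about two unequal principal axes; a schematic of the standard Euler-Bernoulli form with a generic surface-effect correction (the correction term is a placeholder, not the paper's exact Young-Laplace expression):

```latex
% Schematic origin of the two first-mode frequencies (standard
% Euler-Bernoulli result; the surface correction is a generic
% placeholder, not the paper's exact Young-Laplace expression).
\[
  f_i \;=\; \frac{(\beta_1 L)^2}{2 \pi L^2}
      \sqrt{\frac{(EI)^{*}_{i}}{\rho A}}, \qquad i = 1, 2,
  \qquad
  (EI)^{*}_{i} \;=\; E I_i + E^{s} S_i,
\]
% where I_1, I_2 are the second moments of area about the two principal
% axes (unequal for the triangular and rhombic cross-sections, hence
% two distinct first-mode frequencies and dual-mode vibration under
% off-axis actuation), E^{s} is a surface elastic modulus and S_i a
% perimeter-dependent geometric term. The surface term scales with a
% lower power of the cross-sectional size than E I_i, so its relative
% weight grows as the wire thins.
```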
Abstract:
Exceeding the speed limit and driving too fast for the conditions are regularly cited as significant contributing factors in traffic crashes, particularly fatal and serious injury crashes. Despite an extensive body of research highlighting the relationship between increased vehicle speeds and crash risk and severity, speeding remains a pervasive behaviour on Australian roads. The development of effective countermeasures designed to reduce the prevalence of speeding behaviour requires that this behaviour is well understood. The primary aim of this program of research was to develop a better understanding of the influence of drivers’ perceptions and attitudes toward police speed enforcement on speeding behaviour. Study 1 employed focus group discussions with 39 licensed drivers to explore the influence of perceptions relating to specific characteristics of speed enforcement policies and practices on drivers’ attitudes towards speed enforcement. Three primary factors were identified as being most influential: site selection; visibility; and automaticity (i.e., whether the enforcement approach is automated/camera-based or manually operated). Perceptions regarding these enforcement characteristics were found to influence attitudes regarding the perceived legitimacy and transparency of speed enforcement. Moreover, misperceptions regarding speed enforcement policies and practices appeared to also have a substantial impact on attitudes toward speed enforcement, typically in a negative direction. These findings have important implications for road safety given that prior research has suggested that the effectiveness of speed enforcement approaches may be reduced if efforts are perceived by drivers as being illegitimate, such that they do little to encourage voluntary compliance. Study 1 also examined the impact of speed enforcement approaches varying in the degree of visibility and automaticity on self-reported willingness to comply with speed limits. These discussions suggested that all of the examined speed enforcement approaches (see Section 1.5 for more details) generally showed potential to reduce vehicle speeds and encourage compliance with posted speed limits. Nonetheless, participant responses suggested a greater willingness to comply with approaches operated in a highly visible manner, irrespective of the corresponding level of automaticity of the approach. While less visible approaches were typically associated with poorer rates of driver acceptance (e.g., perceived as “sneaky” and “unfair”), participants reported that such approaches would likely encourage long-term and network-wide impacts on their own speeding behaviour, as a function of the increased unpredictability of operations and increased direct (specific deterrence) and vicarious (general deterrence) experiences with punishment. Participants in Study 1 suggested that automated approaches, particularly when operated in a highly visible manner, do little to encourage compliance with speed limits except in the immediate vicinity of the enforcement location. While speed cameras have been criticised on such grounds in the past, such approaches can still have substantial road safety benefits if implemented in high-risk settings. Moreover, site-learning effects associated with automated approaches can also be argued to be a beneficial by-product of enforcement, such that behavioural modifications are achieved even in the absence of actual enforcement. 
Conversely, manually operated approaches were reported to be associated with more network-wide impacts on behaviour. In addition, the reported acceptance of such methods was high, due to the increased swiftness of punishment, the ability for additional illegal driving behaviours to be policed, and the salutary influence associated with increased face-to-face contact with authority. Study 2 involved a quantitative survey conducted with 718 licensed Queensland drivers from metropolitan and regional areas. The survey sought to further examine the influence of the visibility and automaticity of operations on self-reported likelihood and duration of compliance. Overall, the results from Study 2 corroborated those of Study 1. All examined approaches were again found to encourage compliance with speed limits, such that all approaches could be considered "effective". Nonetheless, significantly greater self-reported likelihood and duration of compliance was associated with visibly operated approaches, irrespective of the corresponding automaticity of the approach. In addition, the impact of automaticity was influenced by visibility, such that significantly greater self-reported likelihood of compliance was associated with manually operated approaches, but only when they were operated in a less visible fashion. Conversely, manually operated approaches were associated with significantly greater durations of self-reported compliance, but only when they were operated in a highly visible manner. Taken together, the findings from Studies 1 and 2 suggest that enforcement efforts, irrespective of their visibility or automaticity, generally encourage compliance with speed limits. However, the duration of these effects on behaviour upon removal of the enforcement efforts remains questionable and represents an area where current speed enforcement practices could be improved. Overall, it appears that identifying the optimal mix of enforcement operations, implementing them at a sufficient intensity and increasing the unpredictability of enforcement efforts (e.g., greater use of less visible approaches, random scheduling) are critical elements of success. Hierarchical multiple regression analyses were also performed in Study 2 to investigate the punishment-related and attitudinal constructs that influence self-reported frequency of speeding behaviour. The research was based on the theoretical framework of expanded deterrence theory, augmented with three particular attitudinal constructs. Specifically, previous research examining the influence of attitudes on speeding behaviour has typically focussed only on attitudes toward speeding behaviour in general. This research sought to explore the influence of attitudes more comprehensively by also individually measuring and analysing attitudes toward speed enforcement and attitudes toward the appropriateness of speed limits. Consistent with previous research, a number of classical and expanded deterrence theory variables were found to significantly predict self-reported frequency of speeding behaviour. Significantly greater speeding behaviour was typically reported by those participants who perceived punishment associated with speeding to be less certain, who reported more frequent use of punishment avoidance strategies and who reported greater direct experiences with punishment. A number of interesting differences in the significant predictors among males and females, as well as younger and older drivers, were observed.
Specifically, classical deterrence theory variables appeared most influential on the speeding behaviour of males and younger drivers, while expanded deterrence theory constructs appeared more influential for females. These findings have important implications for the development and implementation of speeding countermeasures. Of the attitudinal factors, significantly greater self-reported frequency of speeding behaviour was reported among participants who held more favourable attitudes toward speeding and who perceived speed limits to be set inappropriately low. Disappointingly, attitudes toward speed enforcement were found to have little influence on reported speeding behaviour over and above the other deterrence theory and attitudinal constructs. Indeed, the relationship between attitudes toward speed enforcement and self-reported speeding behaviour was completely accounted for by attitudes toward speeding. Nonetheless, the complexity of attitudes toward speed enforcement is not yet fully understood, and future research should explore the measurement of this construct more comprehensively. Finally, given the wealth of evidence (both in general and emerging from this program of research) highlighting the association between punishment avoidance and speeding behaviour, Study 2 also sought to investigate the factors that influence the self-reported propensity to use punishment avoidance strategies. A standard multiple regression analysis was conducted for exploratory purposes only. The results revealed that punishment-related and attitudinal factors significantly predicted approximately one fifth of the variance in the dependent variable. The perceived ability to avoid punishment, vicarious punishment experience, vicarious punishment avoidance and attitudes toward speeding were all significant predictors. Future research should examine these relationships more thoroughly and identify additional influential factors. In summary, the current program of research has a number of implications for road safety and for speed enforcement policy and practice decision-making. The research highlights a number of potential avenues for improving public education regarding enforcement efforts and provides a number of insights into punishment avoidance behaviours. In addition, the research strengthens the argument that enforcement approaches should not only demonstrate effectiveness in achieving key road safety objectives, such as reduced vehicle speeds and associated crashes, but also strive to be transparent and legitimate, such that voluntary compliance is encouraged. A number of potential strategies are discussed (e.g., point-to-point speed cameras, intelligent speed adaptation). The correct mix and intensity of enforcement approaches appears critical for achieving optimum effectiveness from enforcement efforts, as do enhancements in the unpredictability of operations and the swiftness of punishment. Achieving these goals should increase both the general and specific deterrent effects of enforcement, through an increased perceived risk of detection and a more balanced exposure to punishment and punishment avoidance experiences.
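The block-entry logic of the hierarchical regressions described above can be sketched briefly (a generic illustration with placeholder blocks, not the thesis's actual model or variables):

```python
import numpy as np
import statsmodels.api as sm

def hierarchical_r2(y, blocks):
    """Hierarchical multiple regression: enter predictor blocks in a
    fixed order and record R-squared after each step. The R-squared
    change at each step is the variance explained by that block over
    and above the blocks entered before it."""
    r2, X = [], None
    for block in blocks:
        X = block if X is None else np.column_stack([X, block])
        r2.append(sm.OLS(y, sm.add_constant(X)).fit().rsquared)
    return r2

# e.g. hierarchical_r2(speeding_freq, [deterrence_vars, attitude_vars])
# (placeholder arrays, not the thesis's measured variables)
```

Entering deterrence variables first and attitudinal constructs second, the R-squared change at the second step estimates what attitudes add over and above deterrence, which is the comparison the abstract reports.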
Abstract:
Traffic safety studies demand more than current micro-simulation models can provide, as these models presume that all drivers of motor vehicles exhibit safe behaviours. Several car-following models are used in various micro-simulation models. This research compares the capabilities of mainstream car-following models to emulate precise driver behaviour parameters such as headways and Times to Collision. The comparison first illustrates which model is more robust in reproducing these metrics. Second, the study conducted a series of sensitivity tests to further explore the behaviour of each model. Based on the outcomes of these two exploratory steps, a modified structure and parameter adjustments are proposed for each car-following model to simulate more realistic vehicle movements, particularly headways and Times to Collision below a certain critical threshold. NGSIM vehicle trajectory data are used to evaluate the modified models' performance in assessing critical safety events within traffic flow. The simulation test outcomes indicate that the proposed modified models reproduce the frequency of critical Times to Collision better than the generic models, while the improvement in headways is not significant. The outcome of this paper facilitates traffic safety assessment using microscopic simulation.
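The two safety metrics at issue are straightforward to compute from paired leader-follower trajectories; a minimal sketch (illustrative code and threshold, not the paper's NGSIM processing):

```python
import numpy as np

def headway_and_ttc(x_lead, v_lead, x_foll, v_foll, lead_length):
    """Per-timestep safety metrics for a leader-follower pair (float
    arrays of positions in m and speeds in m/s along the lane).

    Time headway: bumper-to-bumper gap divided by follower speed.
    Time to Collision (TTC): gap divided by closing speed, defined
    only while the follower is closing in on the leader.
    """
    gap = x_lead - x_foll - lead_length
    headway = np.divide(gap, v_foll,
                        out=np.full_like(gap, np.inf), where=v_foll > 0)
    closing = v_foll - v_lead
    ttc = np.divide(gap, closing,
                    out=np.full_like(gap, np.inf), where=closing > 0)
    return headway, ttc

def critical_ttc_count(ttc, threshold=4.0):
    """Count of time steps below a critical TTC threshold (the 4 s
    value here is illustrative, not the paper's calibrated one)."""
    return int(np.sum(ttc < threshold))
```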
Abstract:
Prostate cancer (CaP) is the second leading cause of cancer-related deaths in North American males and the most common newly diagnosed cancer in men worldwide. Biomarkers are widely used both for early detection and in prognostic tests for cancer. The current, commonly used biomarker for CaP is serum prostate specific antigen (PSA). However, the specificity of this biomarker is low, as its serum level is increased not only in CaP but also in various other diseases, with age and even with body mass index. Human body fluids provide an excellent resource for the discovery of biomarkers, with the advantage over tissue/biopsy samples of ease of access, owing to the less invasive nature of collection. However, their analysis presents challenges in terms of variability and validation. Blood and urine are the two human body fluids most commonly used for CaP research, but their proteomic analyses are limited by the large dynamic range of protein abundance, which makes the detection of low-abundance proteins difficult, and, in the case of urine, by the high salt concentration. To overcome these challenges, different techniques for the removal of high-abundance proteins and the enrichment of low-abundance proteins are used; their applications and limitations are discussed in this review. A number of innovative proteomic techniques have improved the detection of biomarkers. They include two-dimensional differential gel electrophoresis (2D-DIGE), quantitative mass spectrometry (MS) and functional proteomic studies, i.e., investigating the association of post-translational modifications (PTMs) such as phosphorylation, glycosylation and protein degradation. The recent development of quantitative MS techniques such as stable isotope labeling with amino acids in cell culture (SILAC), isobaric tags for relative and absolute quantitation (iTRAQ) and multiple reaction monitoring (MRM) has allowed proteomic researchers to quantitatively compare data from different samples. 2D-DIGE has greatly improved the statistical power of classical 2D gel analysis by introducing an internal control. This chapter aims to review novel CaP biomarkers and to discuss current trends in biomarker research from two angles: the source of biomarkers (particularly human body fluids such as blood and urine), and emerging proteomic approaches for biomarker research.
Abstract:
The authors present a qualitative and quantitative comparison of various similarity measures that form the kernel of common area-based stereo-matching systems. They compare classical difference and correlation measures as well as nonparametric measures based on the rank and census transforms for a number of outdoor images. For robotic applications, important considerations include robustness to image defects such as intensity variation and noise, the number of false matches, and computational complexity. In the absence of ground truth data, the matching techniques are compared on the percentage of matches that pass the left-right consistency test. The authors also evaluate the discriminatory power of several match validity measures reported in the literature for eliminating false matches and for estimating match confidence; for guidance applications, it is essential to have an estimate of confidence in the three-dimensional points generated by stereo vision. Finally, a new validity measure, the rank constraint, is introduced that is capable of resolving ambiguous matches for rank transform-based matching.
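The rank transform mentioned above is compact enough to sketch, together with the left-right consistency test used in place of ground truth (an illustrative implementation, not the authors' code):

```python
import numpy as np

def rank_transform(img, win=5):
    """Replace each pixel by the count of window neighbours whose
    intensity is strictly less than the centre pixel, which makes the
    result invariant to any monotonic intensity change (gain/bias).
    Edges wrap here for brevity; a real implementation would pad."""
    r = win // 2
    out = np.zeros_like(img, dtype=np.int32)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out += (shifted < img).astype(np.int32)
    return out

def left_right_consistent(disp_l, disp_r, tol=1):
    """Left-right consistency test: keep a left-image match only if the
    disparity stored at the corresponding right-image pixel points back
    to (nearly) the same left pixel."""
    h, w = disp_l.shape
    xs = np.arange(w)[None, :]
    xr = np.clip(xs - disp_l, 0, w - 1).astype(int)
    back = disp_r[np.arange(h)[:, None], xr]
    return np.abs(disp_l - back) <= tol
```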
Abstract:
The extraordinary event, for Deleuze, is the object becoming subject – not in the manner of an abstract formulation, such as the substitution of one ideational representation for another but, rather, in the introduction of a vast, new, impersonal plane of subjectivity, populated by object processes and physical phenomena that, in Deleuze's discovery, will be shown to constitute their own subjectivities. Deleuze's polemic of subjectivity (the refusal of the Cartesian subject and the transcendental ego of Husserl) – long attempted by other thinkers – is unique precisely because it heralds the dawning of a new species of objecthood that will qualify as its own peculiar subjectivity. A survey of Deleuze's early work on subjectivity, Empirisme et subjectivité (Deleuze 1953), Le Bergsonisme (Deleuze 1968), and Logique du sens (Deleuze 1969), brings the architectural reader into a peculiar confrontation with what Deleuze calls the 'new transcendental field', the field of subject-producing effects, which for the philosopher takes the place of both the classical and modern subject. Deleuze's theory of consciousness and perception is premised on the critique of Husserlian phenomenology; and ipso facto his question is an architectural problematic, even if the name 'architecture' is not invoked...
Abstract:
National flag carriers are struggling for survival, not only for classical reasons such as increases in fuel costs and taxes or natural disasters, but largely because of their inability to adapt quickly to a competitive environment reshaped by the emergence of budget and Persian Gulf airlines. In this research, we investigate how airlines can transform their business models, via technological and strategic capabilities, to become profitable and sustainable passenger-experience companies. To formulate recommendations, we analyze customer sentiment on social media to understand what people are saying about the airlines.
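A minimal sketch of the kind of social-media sentiment pass described, using NLTK's off-the-shelf VADER analyser as one possible tool (the airline names and posts are placeholders, not the study's data):

```python
# pip install nltk; then run nltk.download("vader_lexicon") once.
from nltk.sentiment.vader import SentimentIntensityAnalyzer

posts = {
    "AirlineA": ["Crew was lovely, smooth flight.",
                 "Two hour delay and no updates. Awful."],
    "AirlineB": ["Cheap fare but the seat was broken."],
}

sia = SentimentIntensityAnalyzer()
for airline, texts in posts.items():
    # VADER's compound score ranges from -1 (negative) to +1 (positive);
    # the per-airline mean gives a crude sentiment ranking.
    mean_compound = sum(sia.polarity_scores(t)["compound"]
                        for t in texts) / len(texts)
    print(airline, round(mean_compound, 3))
```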