224 results for Pseudo-Riemannian metric
Abstract:
The validity of the Multidimensional School Anger Inventory (MSAI) was examined with adolescents from 5 Pacific Rim countries (N = 3,181 adolescents; age, M = 14.8 years; 52% females). Confirmatory factor analyses examined configural invariance for the MSAI's anger experience, hostility, destructive expression, and anger coping subscales. The model did not converge for Peruvian students. Using the top 4 loaded items for anger experience, hostility, and destructive expression, configural invariance and partial metric and scalar invariance were found. Latent means analysis compared mean responses on each subscale to the U.S. sample. Students from other countries showed higher mean responses on the anger experience subscale (ds = .37–.73). Australian (d = .40) and Japanese students (d = .21) had significantly higher mean hostility subscale scores. Australian students had higher mean scores on the destructive expression subscale (d = .30), whereas Japanese students had lower mean scores (d = −.17). The largest latent mean gender differences (females lower than males) were for destructive expression among Australian (d = −.67), Guatemalan (d = −.42), and U.S. (d = −.66) students. This study supported an abbreviated, 12-item MSAI with partial invariance. Implications for the use of the MSAI in comparative research are discussed.
Abstract:
This research project explores how interdisciplinary art practices can provide ways of questioning and envisaging alternative modes of coexistence between humans and the non-humans who, together, make up the environment. As a practice-led project, it combines a body of creative work (50%) and this exegesis (50%). My interdisciplinary artistic practice appropriates methods and processes from science and engineering and merges them into artistic contexts for critical and poetic ends. By blending pseudo-scientific experimentation with creative strategies such as visual fiction, humour, absurd public performance and scripted audience participation, my work engages with a range of debates around ecology. This exegesis details the interplay between critical theory relating to these debates, the work of other creative practitioners and my own evolving artistic practice. Through utilising methods and processes drawn from my prior career in water engineering, I present an interdisciplinary synthesis that seeks to promote improved understandings of the causes and consequences of our ecological actions and inactions.
Abstract:
Denial-of-service (DoS) attacks are a growing concern for networked services like the Internet. In recent years, major Internet e-commerce and government sites have been disabled by various DoS attacks. A common form of DoS attack is a resource depletion attack, in which an attacker tries to overload the server's resources, such as memory or computational power, rendering the server unable to service honest clients. A promising way to deal with this problem is for a defending server to identify and segregate malicious traffic as early as possible. Client puzzles, also known as proofs of work, have been shown to be a promising tool to thwart DoS attacks in network protocols, particularly in authentication protocols. In this thesis, we design efficient client puzzles and propose a stronger security model to analyse client puzzles. We revisit a few key establishment protocols to analyse their DoS-resilience properties and strengthen them using existing and novel techniques. Our contributions in the thesis are manifold. We propose an efficient client puzzle that enjoys its security in the standard model under new computational assumptions. Assuming the presence of powerful DoS attackers, we find a weakness in the most recent security model proposed for analysing client puzzles, and this study leads us to introduce a better security model for analysing client puzzles. We demonstrate the utility of our new security definitions by presenting two stronger hash-based client puzzles. We also show that, using stronger client puzzles, any protocol can be converted into a provably secure DoS-resilient key exchange protocol. In other contributions, we analyse the DoS-resilience properties of network protocols such as Just Fast Keying (JFK) and Transport Layer Security (TLS). In the JFK protocol, we identify a new DoS attack by applying Meadows' cost-based framework to analyse DoS-resilience properties. We also prove that the original security claim of JFK does not hold. We then incorporate an existing technique to reduce the server's cost and prove that the new variant of JFK achieves perfect forward secrecy (a property not achieved by the original JFK protocol) and is secure under the original security assumptions of JFK. Finally, we introduce a novel cost-shifting technique which reduces the computation cost of the server significantly, and we employ the technique in one of the most important network protocols, TLS, to analyse the security of the resultant protocol. We also observe that the cost-shifting technique can be incorporated in any Diffie-Hellman based key exchange protocol to reduce the Diffie-Hellman exponentiation cost of a party by one multiplication and one addition.
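As a rough illustration of the hash-based client-puzzle (proof-of-work) idea mentioned above (a generic sketch only, not the thesis's actual constructions or security model), the Python fragment below lets a server issue a random challenge cheaply and verify a solution with a single hash, while the client must search for a suitable nonce. The difficulty parameter and nonce encoding are illustrative assumptions.

```python
import hashlib
import os

DIFFICULTY_BITS = 16  # illustrative difficulty; real deployments would tune this

def make_puzzle() -> bytes:
    """Server side: issue a fresh random challenge (cheap)."""
    return os.urandom(16)

def solve_puzzle(challenge: bytes) -> int:
    """Client side: brute-force a nonce so that SHA-256(challenge || nonce)
    has DIFFICULTY_BITS leading zero bits (moderately expensive)."""
    target = 1 << (256 - DIFFICULTY_BITS)
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge: bytes, nonce: int) -> bool:
    """Server side: verification costs a single hash, so the server's
    per-request work stays small even under attack."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - DIFFICULTY_BITS))

if __name__ == "__main__":
    c = make_puzzle()
    n = solve_puzzle(c)
    print(verify(c, n))  # True
```

The asymmetry between the costly solve step and the one-hash verify step is what makes such puzzles useful against resource depletion attacks.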
Abstract:
Due to their small collecting volume, diodes are commonly used in small field dosimetry. However, the relative sensitivity of a diode increases as the field size decreases. Conversely, small air gaps have been shown to cause a significant decrease in the sensitivity of a detector as the field size is decreased. Therefore, this study uses Monte Carlo simulations to investigate introducing air upstream of diodes such that they measure with a constant sensitivity across all field sizes in small field dosimetry. Varying thicknesses of air were introduced onto the upstream end of two commercial diodes (PTW 60016 photon diode and PTW 60017 electron diode), as well as a theoretical unenclosed silicon chip, using field sizes as small as 5 mm × 5 mm. The metric D_(w,Q)/D_(Det,Q) used in this study represents the ratio of the dose to a point in water to the dose to the diode's active volume, for a particular field size and location. The optimal thickness of air required to provide a constant sensitivity across all small field sizes was found by plotting D_(w,Q)/D_(Det,Q) as a function of introduced air gap size for various field sizes, and finding the intersection point of these plots; that is, the point at which D_(w,Q)/D_(Det,Q) was constant for all field sizes. The optimal thickness of air was calculated to be 3.3 mm, 1.15 mm and 0.10 mm for the photon diode, electron diode and unenclosed silicon chip respectively. The variation in these results was due to the different design of each detector. When calculated with the new diode design incorporating the upstream air gap, the correction factor k_(Q_clin,Q_msr)^(f_clin,f_msr) was equal to unity to within statistical uncertainty (0.5%) for all three diodes. Cross-axis profile measurements were also improved with the new detector design. The upstream air gap could be implemented on the commercial diodes via a cap consisting of the air cavity surrounded by water-equivalent material. The results for the unenclosed silicon chip show that an ideal small field dosimetry diode could be created by using a silicon chip with a small amount of air above it.
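For context, the correction factor referred to above follows the widely used small-field output-correction notation, in which f_clin and f_msr denote the clinical and machine-specific reference fields and Q_clin, Q_msr the corresponding beam qualities. The standard definition, restated here for the reader rather than quoted from the thesis, is:

```latex
k_{Q_\mathrm{clin},Q_\mathrm{msr}}^{f_\mathrm{clin},f_\mathrm{msr}}
  = \frac{D_{w,Q_\mathrm{clin}}^{f_\mathrm{clin}} \,/\, M_{Q_\mathrm{clin}}^{f_\mathrm{clin}}}
         {D_{w,Q_\mathrm{msr}}^{f_\mathrm{msr}} \,/\, M_{Q_\mathrm{msr}}^{f_\mathrm{msr}}}
```

Here D_w is the dose to water and M is the detector reading in the corresponding field; a value of unity for all field sizes, as reported above, means the detector requires no field-size-dependent correction.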
Abstract:
This thesis highlights the limitations of existing car following models in emulating driver behaviour for safety study purposes. It also compares the capabilities of the mainstream car following models in reproducing precise driver behaviour parameters such as headways and Times to Collision. The comparison evaluates the robustness of each car following model for reproducing safety metrics. A new car following model, based on the personal space concept and the fish school model, is proposed to simulate more precise traffic metrics. This new model is capable of reflecting changes in the headway distribution after imposing speed limits from VSL systems. This research facilitates assessing Intelligent Transportation Systems on motorways using microscopic simulation.
Abstract:
This paper presents a mapping and navigation system for a mobile robot, which uses vision as its sole sensor modality. The system enables the robot to navigate autonomously, plan paths and avoid obstacles using a vision-based topometric map of its environment. The map consists of a globally-consistent pose-graph with a local 3D point cloud attached to each of its nodes. These point clouds are used for direction-independent loop closure and to dynamically generate 2D metric maps for locally optimal path planning. Using this locally semi-continuous metric space, the robot performs shortest path planning instead of following the nodes of the graph, as is done with most other vision-only navigation approaches. The system exploits the local accuracy of visual odometry in creating local metric maps, and uses pose graph SLAM, visual appearance-based place recognition and point cloud registration to create the topometric map. The ability of the framework to sustain vision-only navigation is validated experimentally, and the system is provided as open-source software.
Abstract:
Many successful query expansion techniques ignore information about the term dependencies that exist within natural language. However, researchers have recently demonstrated that consistent and significant improvements in retrieval effectiveness can be achieved by explicitly modelling term dependencies within the query expansion process. This has created an increased interest in dependency-based models. State-of-the-art dependency-based approaches primarily model term associations known within structural linguistics as syntagmatic associations, which are formed when terms co-occur together more often than by chance. However, structural linguistics proposes that the meaning of a word also depends on its paradigmatic associations, which are formed between words that can substitute for each other without affecting the acceptability of a sentence. Given the reliance on word meanings when a user formulates their query, our approach takes the novel step of modelling both syntagmatic and paradigmatic associations within the query expansion process, based on the (pseudo) relevant documents returned in web search. The results demonstrate that this approach can provide significant improvements in web retrieval effectiveness when compared to a strong benchmark retrieval system.
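As a rough illustration of the two association types mentioned above (a simplified sketch under generic assumptions, not the authors' actual expansion model), syntagmatic associations can be scored by how often two terms co-occur in the pseudo-relevant documents, while paradigmatic associations can be scored by how similar the terms' surrounding-word contexts are:

```python
from collections import Counter
from math import log, sqrt

def syntagmatic_score(term_a, term_b, docs):
    """Pointwise mutual information of document-level co-occurrence:
    terms that co-occur more often than chance score highly."""
    n = len(docs)
    a = sum(term_a in d for d in docs)
    b = sum(term_b in d for d in docs)
    ab = sum(term_a in d and term_b in d for d in docs)
    if 0 in (a, b, ab):
        return 0.0
    return log((ab / n) / ((a / n) * (b / n)))

def paradigmatic_score(term_a, term_b, docs, window=2):
    """Cosine similarity of context-word profiles: terms that occur in
    similar contexts (and so could substitute for each other) score highly."""
    def profile(term):
        counts = Counter()
        for doc in docs:
            for i, tok in enumerate(doc):
                if tok == term:
                    counts.update(t for t in doc[max(0, i - window):i + window + 1]
                                  if t != term)
        return counts

    pa, pb = profile(term_a), profile(term_b)
    dot = sum(pa[t] * pb[t] for t in pa)
    norm = sqrt(sum(v * v for v in pa.values())) * sqrt(sum(v * v for v in pb.values()))
    return dot / norm if norm else 0.0

# toy tokenised pseudo-relevant documents
docs = [["cheap", "flights", "to", "rome"],
        ["cheap", "tickets", "to", "rome"],
        ["budget", "flights", "to", "paris"]]
print(syntagmatic_score("cheap", "rome", docs))        # co-occurrence association
print(paradigmatic_score("flights", "tickets", docs))  # substitutability association
```

Candidate expansion terms could then be ranked by a combination of the two scores against the original query terms.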
Abstract:
Stereo visual odometry has received little investigation in high altitude applications due to the generally poor performance of rigid stereo rigs at extremely small baseline-to-depth ratios. Without additional sensing, metric scale is considered lost and odometry is seen as effective only for monocular perspectives. This paper presents a novel modification to stereo-based visual odometry that allows accurate, metric pose estimation from high altitudes, even in the presence of poor calibration and without additional sensor inputs. By relaxing the (typically fixed) stereo transform during bundle adjustment and reducing the dependence on the fixed geometry for triangulation, metrically scaled visual odometry can be obtained in situations where high altitude and structural deformation from vibration would cause traditional algorithms to fail. This is achieved through the use of a novel constrained bundle adjustment routine and an accurately scaled pose initializer. We present visual odometry results demonstrating the technique on a short-baseline stereo pair inside a fixed-wing UAV flying at significant height (~30–100 m).
Abstract:
Constructing train schedules is vital in railways. This complex and time-consuming task is, however, made more difficult by additional requirements to make train schedules robust to delays and other disruptions. For a timetable to be regarded as robust, it should be insensitive to delays of a specified level, and its performance with respect to a given metric should be within given tolerances. In other words, the effect of delays should be identifiable and should be shown to be minimal. To this end, a sensitivity analysis is proposed that identifies affected operations; more specifically, a sensitivity analysis for determining which operation delays cause each operation to be affected. The information provided by this analysis gives another measure of timetable robustness and also provides control information that can be used when delays occur in practice. Several algorithms are proposed to identify this information, and they utilise a disjunctive graph model of train operations. Upon completion, the sets of affected operations can also be used to determine the impact of all delays without further disjunctive graph evaluations.
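To make the idea of identifying affected operations concrete, the sketch below (an assumed simplification; the thesis's disjunctive-graph algorithms are more involved) propagates a single source delay through a precedence graph of scheduled operations and reports every operation whose start time is pushed later:

```python
from collections import deque

def affected_operations(successors, duration, start, delayed_op, delay):
    """Propagate `delay` on `delayed_op` through the precedence graph and
    return the operations whose scheduled start times are pushed back.

    successors: dict op -> list of ops that must start after op finishes
    duration:   dict op -> scheduled duration
    start:      dict op -> scheduled start time
    """
    new_start = dict(start)
    new_start[delayed_op] = start[delayed_op] + delay
    affected = set()
    queue = deque([delayed_op])
    while queue:
        op = queue.popleft()
        finish = new_start[op] + duration[op]
        for nxt in successors.get(op, []):
            if finish > new_start[nxt]:  # successor can no longer start on time
                new_start[nxt] = finish
                affected.add(nxt)
                queue.append(nxt)
    return affected

# toy example: operations of two trains sharing a section
successors = {"A1": ["A2", "B1"], "A2": ["B2"], "B1": ["B2"]}
duration   = {"A1": 5, "A2": 5, "B1": 5, "B2": 5}
start      = {"A1": 0, "A2": 6, "B1": 7, "B2": 13}
print(affected_operations(successors, duration, start, "A1", 4))  # {'A2', 'B1', 'B2'}
```

Storing the resulting sets of affected operations per possible source delay is what allows delay impacts to be looked up later without re-evaluating the graph.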
Abstract:
Background: Aphasia is an acquired language disorder that can present a significant barrier to patient involvement in healthcare decisions. Speech-language pathologists (SLPs) are viewed as experts in the field of communication. However, many SLP students do not receive practical training in techniques to communicate with people with aphasia (PWA) until they encounter PWA during clinical education placements. Methods: This study investigated the confidence and knowledge of SLP students in communicating with PWA prior to clinical placements using a customised questionnaire. Confidence in communicating with people with aphasia was assessed using a 100-point visual analogue scale. Linear and logistic regressions were used to examine the association between confidence and age, and between confidence and course type (graduate-entry masters or undergraduate), respectively. Knowledge of strategies to assist communication with PWA was examined by asking respondents to list specific strategies that could assist communication with PWA. Results: SLP students were not confident with the prospect of communicating with PWA, reporting a median of 29 points (inter-quartile range 17–47) on the visual analogue confidence scale. Only four (8.2%) of respondents rated their confidence greater than 55 (out of 100). Regression analyses indicated no relationship existed between confidence and students' age (p = 0.31, r-squared = 0.02), or between confidence and course type (p = 0.22, pseudo r-squared = 0.03). Students displayed limited knowledge about communication strategies. Thematic analysis of strategies revealed four overarching themes: Physical, Verbal Communication, Visual Information and Environmental Changes. While most students identified potential use of resources (such as images and written information), fewer students identified strategies to alter their verbal communication (such as reduced speech rate). Conclusions: SLP students who had received aphasia-related theoretical coursework, but had not commenced clinical placements with PWA, were not confident in their ability to communicate with PWA. Students may benefit from an educational intervention or curriculum modification to incorporate practical training in effective strategies to communicate with PWA before they encounter PWA in clinical settings. Ensuring students have confidence and knowledge of potential communication strategies to assist communication with PWA may allow them to focus their learning experiences on more specific clinical domains, such as clinical reasoning, rather than on building foundation interpersonal communication skills.
In the blink of an eye: the circadian effects on ocular and subjective indices of driver sleepiness
Abstract:
Driver sleepiness contributes substantially to fatal and severe crashes, and the contribution it makes to less serious crashes is likely to be as great or greater. Currently, drivers' awareness of sleepiness (subjective sleepiness) remains a critical component for the mitigation of sleep-related crashes. Nonetheless, numerous calls have been made for technological monitors of drivers' physiological sleepiness levels so drivers can be 'alerted' when approaching high levels of sleepiness. Several physiological indices of sleepiness show potential as a reliable metric to monitor drivers' sleepiness levels, with eye blink indices being a promising candidate. However, extensive evaluations of eye blink measures are lacking, including of the effects that the endogenous circadian rhythm can have on eye blinks. To examine the utility of ocular measures, 26 participants completed a simulated driving task after partial sleep restriction while physiological measures of blink rate and duration were recorded. To examine circadian effects, participants were randomly assigned to complete either a morning or an afternoon session of the driving task. The results show that subjective sleepiness levels increased over the duration of the task. The blink duration index was sensitive to increases in sleepiness during morning testing, but was not sensitive during afternoon testing. This finding suggests that blink indices are still far from being a reliable, specific metric for sleepiness. The subjective measures had a larger effect size than the blink measures. Therefore, awareness of sleepiness remains a critical factor for driver sleepiness and the mitigation of sleep-related crashes.
Abstract:
Increasing global competition, rapid technological changes, advances in manufacturing and information technology and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high quality products at a lower cost and in a shorter period of time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring performance improvements for dynamic lean supply chain situations. Therefore, appropriate measurement of lean supply chain performance has become imperative. There are many lean tools available for supply chains; however, the effectiveness of a lean tool depends on the type of product and supply chain. One tool may be highly effective for a supply chain involved in high volume products but may not be effective for low volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy. This thesis develops an effective method to measure the performance of supply chains using both quantitative and qualitative metrics, and investigates the effects of product type and lean tool selection on supply chain performance. Supply chain performance metrics and the effects of various lean tools on the performance metrics mentioned in the SCOR framework have been investigated. A lean supply chain model based on the SCOR metric framework is then developed, in which non-lean and lean as well as quantitative and qualitative measures are incorporated as appropriate metrics. The values of these metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data have been collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy-based method is applied to measure the performance improvements in the supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative so as to maximise similarity with the positive ideal solution and minimise similarity with the negative ideal solution, the performances of lean and non-lean supply chain situations for three different apparel products have been evaluated. To address the research questions related to an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses are investigated. Empirical results show that implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics; it can therefore effectively measure the improvements in a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have a significant effect on lean supply chain performance. Future studies can conduct multiple case studies in different contexts.
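The following is a compact, generic fuzzy TOPSIS sketch (triangular fuzzy numbers and vertex distance to the fuzzy positive and negative ideal solutions); the alternatives, ratings and criteria are made up for illustration and do not reproduce the thesis's apparel case-study data, and criterion weighting is omitted for brevity:

```python
from math import sqrt

# A triangular fuzzy number is represented as a tuple (l, m, u).
def vertex_distance(a, b):
    """Euclidean (vertex) distance between two triangular fuzzy numbers."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3)

def fuzzy_topsis(ratings, benefit):
    """ratings[alt][j] is a normalised triangular fuzzy rating in [0, 1] on criterion j.
    benefit[j] is True for benefit criteria, False for cost criteria.
    Returns the closeness coefficient of each alternative (higher = better)."""
    n_crit = len(benefit)
    fpis = [(1.0, 1.0, 1.0) if b else (0.0, 0.0, 0.0) for b in benefit]  # fuzzy positive ideal
    fnis = [(0.0, 0.0, 0.0) if b else (1.0, 1.0, 1.0) for b in benefit]  # fuzzy negative ideal
    cc = {}
    for alt, row in ratings.items():
        d_plus = sum(vertex_distance(row[j], fpis[j]) for j in range(n_crit))
        d_minus = sum(vertex_distance(row[j], fnis[j]) for j in range(n_crit))
        cc[alt] = d_minus / (d_plus + d_minus)  # closer to FPIS, farther from FNIS -> higher
    return cc

# toy example: two supply chain configurations rated on time, quality, flexibility
ratings = {
    "non-lean": [(0.2, 0.4, 0.6), (0.3, 0.5, 0.7), (0.2, 0.3, 0.5)],
    "lean":     [(0.5, 0.7, 0.9), (0.6, 0.8, 1.0), (0.4, 0.6, 0.8)],
}
print(fuzzy_topsis(ratings, benefit=[True, True, True]))
```

The closeness coefficient provides the single performance measure referred to above: an alternative scores highly when it is simultaneously near the positive ideal and far from the negative ideal.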
Abstract:
Organoclays were prepared from Arizona SAz-2 Ca-montmorillonite with different surfactant (DDTMA and HDTMA) loadings through direct ion exchange. The structural properties of the prepared organoclays were characterized by XRD and BET analyses. Batch experiments were carried out on the adsorption of bisphenol A (BPA) under different experimental conditions of pH and temperature to determine the optimum adsorption conditions. The hydrophobic phase and positively charged surface created by the loaded surfactant molecules are responsible for the adsorption of BPA. The adsorption of BPA onto the organoclays is well described by the pseudo-second-order kinetic model and the Langmuir isotherm. The maximum adsorption capacity of the organoclays for BPA obtained from the Langmuir isotherm was 151.52 mg/g at 297 K. This value is among the highest reported for BPA adsorption compared with other adsorbents. In addition, the adsorption process was spontaneous and exothermic based on the adsorption thermodynamics study. The organoclays intercalated with longer-chain surfactant molecules possessed a greater adsorption capacity for BPA, even under alkaline conditions. This process provides a pathway for the removal of BPA from contaminated waters.
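For reference, the two models named above have the following standard textbook forms (restated here, not the fitted parameter values from the study), where q_t and q_e are the amounts adsorbed at time t and at equilibrium, k_2 is the pseudo-second-order rate constant, C_e the equilibrium concentration, q_m the maximum adsorption capacity and K_L the Langmuir constant:

```latex
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
\quad\text{(pseudo-second-order, linearised)}
\qquad
q_e = \frac{q_m K_L C_e}{1 + K_L C_e}
\quad\text{(Langmuir isotherm)}
```

The reported 151.52 mg/g corresponds to the fitted q_m of the Langmuir expression at 297 K.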
Abstract:
Because of increased competition between healthcare providers, higher customer expectations, stringent checks on insurance payments and new government regulations, it has become vital for healthcare organisations to enhance the quality of the care they provide, to increase efficiency, and to improve the cost effectiveness of their services. Consequently, a number of quality management concepts and tools are employed in the healthcare domain to achieve the most efficient use of time, manpower, space and other resources. Emergency departments are designed to provide a high-quality medical service with immediate availability of resources to those in need of emergency care. The challenge of maintaining a smooth flow of patients in emergency departments is a global problem. This study attempts to improve patient flow in emergency departments by combining Lean techniques and Six Sigma methodology in a comprehensive conceptual framework. The proposed research will develop a systematic approach, through the integration of Lean techniques with Six Sigma methodology, to improve patient flow in emergency departments. The results reported in this paper are based on a standard questionnaire survey of 350 patients in the Emergency Department of Aseer Central Hospital in Saudi Arabia. The results of the study allowed us to determine the most significant variables affecting patient satisfaction with patient flow, including waiting time during treatment in the emergency department, the effectiveness of the system when dealing with patient complaints, and the layout of the emergency department. The proposed model will be developed into a performance evaluation metric based on these critical variables, to be evaluated in future work using fuzzy logic for continuous quality improvement.
Predicting invasion in grassland ecosystems: is exotic dominance the real embarrassment of richness?
Abstract:
Invasions have increased the size of regional species pools, but are typically assumed to reduce native diversity. However, global-scale tests of this assumption have been elusive because of the focus on exotic species richness, rather than relative abundance. This is problematic because low invader richness can indicate invasion resistance by the native community or, alternatively, dominance by a single exotic species. Here, we used a globally replicated study to quantify relationships between exotic richness and abundance in grass-dominated ecosystems in 13 countries on six continents, ranging from salt marshes to alpine tundra. We tested effects of human land use, native community diversity, herbivore pressure, and nutrient limitation on exotic plant dominance. Despite its widespread use, exotic richness was a poor proxy for exotic dominance at low exotic richness, because sites that contained few exotic species ranged from relatively pristine (low exotic richness and cover) to almost completely exotic-dominated ones (low exotic richness but high exotic cover). Both exotic cover and richness were predicted by native plant diversity (native grass richness) and land use (distance to cultivation). Although climate was important for predicting both exotic cover and richness, climatic factors predicting cover (precipitation variability) differed from those predicting richness (maximum temperature and mean temperature in the wettest quarter). Herbivory and nutrient limitation did not predict exotic richness or cover. Exotic dominance was greatest in areas with low native grass richness at the site- or regional-scale. Although this could reflect native grass displacement, a lack of biotic resistance is a more likely explanation, given that grasses comprise the most aggressive invaders. These findings underscore the need to move beyond richness as a surrogate for the extent of invasion, because this metric confounds monodominance with invasion resistance. Monitoring species' relative abundance will more rapidly advance our understanding of invasions.